Artificial Neural Network for PHP
https://ann.thwien.de/w/Main_Page
MediaWiki 1.39.13
Main Page
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology known as the ''multilayer perceptron'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; several improvements and changes were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. For a short overview of the benefits of neural networks, have a look at the page [[Neural Networks]].
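As background, the forward pass of a multilayer perceptron can be sketched in a few lines of plain PHP. This is a conceptual illustration only, not the project's actual implementation: each layer multiplies its inputs by a weight matrix and applies a transfer function.

```php
<?php
// Conceptual sketch of a multilayer perceptron forward pass (illustration
// only, NOT the ANN library's code): every neuron computes a weighted sum
// of its inputs and passes it through a transfer function.

function layerForward(array $inputs, array $weights): array
{
    $outputs = [];
    foreach ($weights as $neuronWeights) {   // one weight row per neuron
        $sum = 0.0;
        foreach ($inputs as $i => $x) {
            $sum += $x * $neuronWeights[$i];
        }
        $outputs[] = tanh($sum);             // transfer function
    }
    return $outputs;
}

// Two inputs -> one hidden layer with two neurons -> one output neuron.
$hidden = layerForward([1.0, 0.0], [[0.5, -0.5], [0.3, 0.8]]);
$output = layerForward($hidden, [[1.0, -1.0]]);
```

Stacking `layerForward` calls like this is what makes the topology "multilayer": the outputs of one layer become the inputs of the next.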
== Versions and Change-Log ==
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.0 by Thomas Wien''' (17.12.2007) [[Download]]
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() & srand()
* Changes in saving and loading the network
* Printing network details to the browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
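The momentum feature listed above can be illustrated with a self-contained snippet (plain PHP, not the library's own API): a fraction of the previous weight change is blended into the current one, which damps oscillation during backpropagation training.

```php
<?php
// Gradient-descent weight update with momentum (illustration only, not the
// ANN library's API). Part of the previous weight change is carried over,
// smoothing the trajectory of the weights during training.
function updateWeight(float $weight, float $gradient, float &$prevDelta,
                      float $learningRate = 0.5, float $momentum = 0.9): float
{
    $delta = $learningRate * $gradient + $momentum * $prevDelta;
    $prevDelta = $delta;                 // remembered for the next step
    return $weight + $delta;
}

$prev = 0.0;
$w = updateWeight(1.0, 0.1, $prev);      // first step: plain gradient step
$w = updateWeight($w, 0.1, $prev);       // second step: momentum kicks in
```

With momentum at 0, this reduces to ordinary gradient descent; values near 0.9 are a common default.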
== Todo ==
* Separation of classes into several files
* Graphical output of neural network values
* Examples
* Version control via Subversion
* Wiki: More details on installation and use
* Wiki: Project-specific logo
* PHPDoc: More detailed documentation
== Overview ==
* [[Neural Networks]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
Download
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
The current stable version of the ANN implementation for PHP 5.x is version 2.0.0, released on 17 December 2007. See the [[Installation]] section for information on how to integrate these PHP libraries into your project.
== Version '''2.0.0''' (2007-12-17) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip]
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz]
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML, online)]
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() & srand()
* Changes in saving and loading the network
* Printing network details to the browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
Installation
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps needed to integrate the ANN source code into your project.
== Installation ==
* [[Download]] the source code
* Unpack the source code
>tar -xzf ann200.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source:
<source lang="php">
<?php
require_once ('ANN/ANN_Network.php');
$ann = new ANN_Network;
?>
</source>
* Learn to use the library. Have a look at the [[Examples]] chapter.
8709908ae735fba052c91ca16b28adea1063b14c
Copyright
0
4
12
2007-12-17T20:55:15Z
Thwien
2
New page: The copyright conditions are included in the source files. <code> /** * Artificial Neural Network - Version 2.0.0 * * For updates and changes visit the project page at http://ann.thwie...
wikitext
text/x-wiki
The copyright conditions are included in the source files.
<code>
/**
* Artificial Neural Network - Version 2.0.0
*
* For updates and changes visit the project page at http://ann.thwien.de/
*
*
*
* <b>LICENCE</b>
*
* This source file is freely re-distributable, with or without modifications
* provided the following conditions are met:
*
* 1. The source files must retain the copyright notice below, this list of
* conditions and the following disclaimer.
*
* 2. The name of the author must not be used to endorse or promote products
* derived from this source file without prior written permission. For
* written permission, please contact me.
*
* <b>DISCLAIMER</b>
*
 * THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND
* ANY EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A
* PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE PHP
* AUTHOR OR HIS CONTRIBUTORS BE LIABLE FOR ANY DIRECT,
* INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
* (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
* HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
* STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
* ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
* OF THE POSSIBILITY OF SUCH DAMAGE.
*
* @author Eddy Young <jeyoung_at_priscimon_dot_com>
* @author Thomas Wien <info_at_thwien_dot_de>
* @version ANN Version 1.0 by Eddy Young
* @version ANN Version 2.0 by Thomas Wien
* @copyright Copyright (c) 2002 Eddy Young
* @copyright Copyright (c) 2007 Thomas Wien
* @package ANN
*/
</code>
a76b88143bfdc173973ff33848b57e996ef698a5
Artificial Neural Network for PHP:General disclaimer
4
5
13
2007-12-17T20:59:58Z
Thwien
2
New page: [http://thwien.de/internet/sites/service.php Disclaimer / Impressum]
wikitext
text/x-wiki
[http://thwien.de/internet/sites/service.php Disclaimer / Impressum]
eea8f1da35d841516c5fdf605fb4a0e145e29ae0
Neural Networks
0
6
23
2007-12-19T14:47:39Z
Thwien
2
New page: == Information about Neural Networks == * [http://en.wikipedia.org/wiki/Neural_Network Neural Network (en.wikipedia.org)] * [http://en.wikipedia.org/wiki/Multilayer_perceptron Multilayer ...
wikitext
text/x-wiki
== Information about Neural Networks ==
* [http://en.wikipedia.org/wiki/Neural_Network Neural Network (en.wikipedia.org)]
* [http://en.wikipedia.org/wiki/Multilayer_perceptron Multilayer Perceptron (en.wikipedia.org)]
1a13e49ca11f2d7194c4d431e15968c32d0c2b2e
27
23
2007-12-20T19:00:43Z
Thwien
2
/* Information about Neural Networks */
wikitext
text/x-wiki
== Information about Neural Networks ==
English
* [http://en.wikipedia.org/wiki/Neural_Network Neural Network (en.wikipedia.org)]
* [http://en.wikipedia.org/wiki/Multilayer_perceptron Multilayer Perceptron (en.wikipedia.org)]
German
* [http://de.wikipedia.org/wiki/K%C3%BCnstliches_neuronales_Netz Künstliches Neuronales Netz (de.wikipedia.org)]
* [http://de.wikipedia.org/wiki/Perzeptron#Mehrlagiges_Perzeptron Mehrlagiges Perzeptron (de.wikipedia.org)]
06aab2320ed67a65e3448ad6e4b82972379bdb2e
29
27
2007-12-21T09:30:49Z
Thwien
2
wikitext
text/x-wiki
== Questions and Answers ==
* '''What is a neural network?'''
An artificial neural network is a mathematical model of an associative biological neural network.
* '''What are the differences between a biological and an artificial neural network?'''
There are huge differences between the two. The neurons of a biological neural network work massively in parallel, and such a network consists of billions or trillions of densely interconnected neurons. An artificial neural network, by comparison, has only a small number of neurons, e.g. 20, depending on the amount of data the network has to associate. Such an artificial network is not wired at random, but as a controlled and planned layer structure with an input layer, one or a few hidden layers and an output layer (if a multilayer perceptron topology is used). And since an artificial neural network is computer software, it works sequentially, which is of course another big difference from a biological neural network.
* '''What is the benefit of an artificial neural network?'''
The big advantage of the human brain is that it can match previously unseen things against a learned model stored in its neurons. This is called association. Technically this is a form of fuzzy information processing, so an artificial neural network can be used in environments where the input information is incomplete (as in face recognition).
* '''What are examples for technical use of artificial neural networks?'''
There are some very interesting applications of artificial neural networks. For example, the local electricity utility in Duesseldorf, Germany uses a multilayer perceptron for the daily prediction of the city's power consumption, based on temperature, humidity, day of the week, etc. Another example is predicting how many units of a product a supermarket will sell in one week; with this information, ordering and storage can be planned much more accurately. Such a network can also be used to predict the number of daily calls in a call centre, in order to plan how many co-workers have to work that day. And the German Post uses neural networks to recognize postal codes (PLZ) written on letters by machine or by hand; after recognition, a machine-readable code is printed on the letter, so further recognition is not necessary again.
== Information about Neural Networks ==
English
* [http://en.wikipedia.org/wiki/Neural_Network Neural Network (en.wikipedia.org)]
* [http://en.wikipedia.org/wiki/Multilayer_perceptron Multilayer Perceptron (en.wikipedia.org)]
German
* [http://de.wikipedia.org/wiki/K%C3%BCnstliches_neuronales_Netz Künstliches Neuronales Netz (de.wikipedia.org)]
* [http://de.wikipedia.org/wiki/Perzeptron#Mehrlagiges_Perzeptron Mehrlagiges Perzeptron (de.wikipedia.org)]
23ebac5caefc440283ce8cab4b0a6b64704f6261
30
29
2007-12-21T09:36:09Z
Thwien
2
/* Questions and Answers */
wikitext
text/x-wiki
== Questions and Answers ==
* '''What is a neural network?'''
An artificial neural network is a mathematical model of an associative biological neural network.
* '''What are the differences between a biological and an artificial neural network?'''
There are huge differences between the two. The neurons of a biological neural network work massively in parallel, and such a network consists of billions or trillions of densely interconnected neurons. An artificial neural network, by comparison, has only a small number of neurons, e.g. 20, depending on the amount of data the network has to associate. Such an artificial network is not wired at random, but as a controlled and planned layer structure with an input layer, one or a few hidden layers and an output layer (if a multilayer perceptron topology is used). And since an artificial neural network is computer software, it works sequentially, which is of course another big difference from a biological neural network.
* '''What is the benefit of an artificial neural network?'''
The big advantage of the human brain is that it can match previously unseen things against a learned model stored in its neurons. This is called association. Technically this is a form of fuzzy information processing, so an artificial neural network can be used in environments where the input information is incomplete (as in face recognition).
* '''What are examples for technical use of artificial neural networks?'''
There are some very interesting applications of artificial neural networks. For example, the local electricity utility in Duesseldorf, Germany uses a multilayer perceptron for the daily prediction of the city's power consumption, based on temperature, humidity, day of the week, etc. Another example is predicting how many units of a product a supermarket will sell in one week; with this information, ordering and storage can be planned much more accurately. Such a network can also be used to predict the number of daily calls in a call centre, in order to plan how many co-workers have to work that day. And the German Post uses neural networks to recognize postal codes (PLZ) written on letters by machine or by hand; after recognition, a machine-readable code is printed on the letter, so further recognition is not necessary again. Artificial neural networks are also used to find numeric solutions for difficult mathematical functions: the network can associate the output of a function with one or more unknown inputs, which is only possible by learning known inputs and outputs. The artificial network then has an "idea" of the mathematical rule.
== Information about Neural Networks ==
English
* [http://en.wikipedia.org/wiki/Neural_Network Neural Network (en.wikipedia.org)]
* [http://en.wikipedia.org/wiki/Multilayer_perceptron Multilayer Perceptron (en.wikipedia.org)]
German
* [http://de.wikipedia.org/wiki/K%C3%BCnstliches_neuronales_Netz Künstliches Neuronales Netz (de.wikipedia.org)]
* [http://de.wikipedia.org/wiki/Perzeptron#Mehrlagiges_Perzeptron Mehrlagiges Perzeptron (de.wikipedia.org)]
1a0a64fd140b69ba297555285df883fb8bf391cb
31
30
2007-12-21T09:48:06Z
Thwien
2
/* Questions and Answers */
wikitext
text/x-wiki
== Questions and Answers ==
* '''What is a neural network?'''
An artificial neural network is a mathematical model of an associative biological neural network.
* '''What are the differences between a biological and an artificial neural network?'''
There are huge differences between the two. The neurons of a biological neural network work massively in parallel, and such a network consists of billions or trillions of densely interconnected neurons. An artificial neural network, by comparison, has only a small number of neurons, e.g. 20, depending on the amount of data the network has to associate. Such an artificial network is not wired at random, but as a controlled and planned layer structure with an input layer, one or a few hidden layers and an output layer (if a multilayer perceptron topology is used). And since an artificial neural network is computer software, it works sequentially, which is of course another big difference from a biological neural network.
* '''What is the benefit of an artificial neural network?'''
The big advantage of the human brain is that it can match previously unseen things against a learned model stored in its neurons. This is called association. Technically this is a form of fuzzy information processing, so an artificial neural network can be used in environments where the input information is incomplete (as in face recognition).
* '''What are examples for technical use of artificial neural networks?'''
There are some very interesting applications of artificial neural networks. For example, the local electricity utility in Duesseldorf, Germany uses a multilayer perceptron for the daily prediction of the city's power consumption, based on temperature, humidity, day of the week, etc. Another example is predicting how many units of a product a supermarket will sell in one week; with this information, ordering and storage can be planned much more accurately. Such a network can also be used to predict the number of daily calls in a call centre, in order to plan how many co-workers have to work that day. And the German Post uses neural networks to recognize postal codes (PLZ) written on letters by machine or by hand; after recognition, a machine-readable code is printed on the letter, so further recognition is not necessary again. Artificial neural networks are also used to find numeric solutions for difficult mathematical functions: the network can associate the output of a function with one or more unknown inputs, which is only possible by learning known inputs and outputs. The artificial network then has an "idea" of the mathematical rule.
* '''How to train an artificial neural network?'''
The same way a human brain does: learning by doing, repeated again and again. The artificial neural network receives inputs and produces an output. This output is compared with the desired output; if they differ, the network has to be adjusted. The procedure is repeated until the output fits the desired output, and this is done for all known inputs and outputs. Afterwards the artificial network "knows" the rules for producing the right output, and its "knowledge" can be used to predict or recognize the outputs belonging to known or unknown inputs.
* '''Can an artificial neural network forget?'''
Yes. Like the human brain, an artificial network is an associative processor: it can make mistakes and it can forget knowledge if certain inputs are trained only rarely. If you, as a human being, rarely practise playing the piano, you cannot play it perfectly; the same holds for an artificial network.
== Information about Neural Networks ==
English
* [http://en.wikipedia.org/wiki/Neural_Network Neural Network (en.wikipedia.org)]
* [http://en.wikipedia.org/wiki/Multilayer_perceptron Multilayer Perceptron (en.wikipedia.org)]
German
* [http://de.wikipedia.org/wiki/K%C3%BCnstliches_neuronales_Netz Künstliches Neuronales Netz (de.wikipedia.org)]
* [http://de.wikipedia.org/wiki/Perzeptron#Mehrlagiges_Perzeptron Mehrlagiges Perzeptron (de.wikipedia.org)]
2d5656650db1b45496fc91f3cdeacfb563f4852c
41
31
2007-12-21T10:47:55Z
Thwien
2
/* Questions and Answers */
wikitext
text/x-wiki
== Questions and Answers ==
* '''What is a neural network?'''
An artificial neural network is a mathematical model of an associative biological neural network.
* '''What are the differences between a biological and an artificial neural network?'''
There are huge differences between the two. The neurons of a biological neural network work massively in parallel, and such a network consists of billions or trillions of densely interconnected neurons. An artificial neural network, by comparison, has only a small number of neurons, e.g. 20, depending on the amount of data the network has to associate. Such an artificial network is not wired at random, but as a controlled and planned layer structure with an input layer, one or a few hidden layers and an output layer (if a multilayer perceptron topology is used). And since an artificial neural network is computer software, it works sequentially, which is of course another big difference from a biological neural network.
* '''What is the benefit of an artificial neural network?'''
The big advantage of the human brain is that it can match previously unseen things against a learned model stored in its neurons. This is called association. Technically this is a form of fuzzy information processing, so an artificial neural network can be used in environments where the input information is incomplete (as in face recognition).
* '''What are examples for technical use of artificial neural networks?'''
There are some very interesting applications of artificial neural networks. For example, the local electricity utility in Duesseldorf, Germany uses a multilayer perceptron for the daily prediction of the city's power consumption, based on temperature, humidity, day of the week, etc. Another example is predicting how many units of a product a supermarket will sell in one week; with this information, ordering and storage can be planned much more accurately. Such a network can also be used to predict the number of daily calls in a call centre, in order to plan how many co-workers have to work that day. And the German Post uses neural networks to recognize postal codes (PLZ) written on letters by machine or by hand; after recognition, a machine-readable code is printed on the letter, so further recognition is not necessary again. Artificial neural networks are also used to find numeric solutions for difficult mathematical functions: the network can associate the output of a function with one or more unknown inputs, which is only possible by learning known inputs and outputs. The artificial network then has an "idea" of the mathematical rule.
* '''How to train an artificial neural network?'''
The same way a human brain does: learning by doing, repeated again and again. The artificial neural network receives inputs and produces an output. This output is compared with the desired output; if they differ, the network has to be adjusted. The procedure is repeated until the output fits the desired output, and this is done for all known inputs and outputs. Afterwards the artificial network "knows" the rules for producing the right output, and its "knowledge" can be used to predict or recognize the outputs belonging to known or unknown inputs.
* '''Can an artificial neural network forget?'''
Yes. Like the human brain, an artificial network is an associative processor: it can make mistakes and it can forget knowledge if certain inputs are trained only rarely. If you, as a human being, rarely practise playing the piano, you cannot play it perfectly; the same holds for an artificial network.
* '''Does a neural network have consciousness?'''
The one and only answer is: '''No'''. How consciousness arises in a biological neural network has not been figured out to this day, but for an artificial neural network it can safely be said that there is no consciousness, and it would not gain one even if the network became very complex. Therefore an artificial neural network cannot die, feel pain or become depressed.
* '''Do artificial neural networks have disadvantages?'''
Yes, there are a few problems. A number of neural network topologies exist today, and each topology is suited to a particular kind of problem. Neural networks are slow to train, because training can take tens of thousands of iterations. And a neural network can overfit if it is overtrained.
== Information about Neural Networks ==
English
* [http://en.wikipedia.org/wiki/Neural_Network Neural Network (en.wikipedia.org)]
* [http://en.wikipedia.org/wiki/Multilayer_perceptron Multilayer Perceptron (en.wikipedia.org)]
German
* [http://de.wikipedia.org/wiki/K%C3%BCnstliches_neuronales_Netz Künstliches Neuronales Netz (de.wikipedia.org)]
* [http://de.wikipedia.org/wiki/Perzeptron#Mehrlagiges_Perzeptron Mehrlagiges Perzeptron (de.wikipedia.org)]
68c83cd7da2b169fe71b7e658c48ef50bd75fbd8
46
41
2007-12-21T11:38:57Z
Thwien
2
/* Questions and Answers */
wikitext
text/x-wiki
== Questions and Answers ==
* '''What is a neural network?'''
An artificial neural network is a mathematical model of an associative biological neural network.
* '''What are the differences between a biological and an artificial neural network?'''
There are huge differences between the two. The neurons of a biological neural network work massively in parallel, and such a network consists of billions or trillions of densely interconnected neurons. An artificial neural network, by comparison, has only a small number of neurons, e.g. 20, depending on the amount of data the network has to associate. Such an artificial network is not wired at random, but as a controlled and planned layer structure with an input layer, one or a few hidden layers and an output layer (if a multilayer perceptron topology is used). And since an artificial neural network is computer software, it works sequentially, which is of course another big difference from a biological neural network.
* '''What is the benefit of an artificial neural network?'''
The big advantage of the human brain is that it can match previously unseen things against a learned model stored in its neurons. This is called association. Technically this is a form of fuzzy information processing, so an artificial neural network can be used in environments where the input information is incomplete (as in face recognition).
* '''What are examples for technical use of artificial neural networks?'''
There are some very interesting applications of artificial neural networks. For example, the local electricity utility in Duesseldorf, Germany uses a multilayer perceptron for the daily prediction of the city's power consumption, based on temperature, humidity, day of the week, etc. Another example is predicting how many units of a product a supermarket will sell in one week; with this information, ordering and storage can be planned much more accurately. Such a network can also be used to predict the number of daily calls in a call centre, in order to plan how many co-workers have to work that day. And the German Post uses neural networks to recognize postal codes (PLZ) written on letters by machine or by hand; after recognition, a machine-readable code is printed on the letter, so further recognition is not necessary again. Artificial neural networks are also used to find numeric solutions for difficult mathematical functions: the network can associate the output of a function with one or more unknown inputs, which is only possible by learning known inputs and outputs. The artificial network then has an "idea" of the mathematical rule. General classification can also be done with a neural network; for example, such a network can tell from an image whether the person shown is looking left or right, laughing or not, male or female, and so on.
* '''How to train an artificial neural network?'''
The same way a human brain does: learning by doing, repeated again and again. The artificial neural network receives inputs and produces an output. This output is compared with the desired output; if they differ, the network has to be adjusted. The procedure is repeated until the output fits the desired output, and this is done for all known inputs and outputs. Afterwards the artificial network "knows" the rules for producing the right output, and its "knowledge" can be used to predict or recognize the outputs belonging to known or unknown inputs.
* '''Can an artificial neural network forget?'''
Yes. Like the human brain, an artificial network is an associative processor: it can make mistakes and it can forget knowledge if certain inputs are trained only rarely. If you, as a human being, rarely practise playing the piano, you cannot play it perfectly; the same holds for an artificial network.
* '''Does a neural network have consciousness?'''
The one and only answer is: '''No'''. How consciousness arises in a biological neural network has not been figured out to this day, but for an artificial neural network it can safely be said that there is no consciousness, and it would not gain one even if the network became very complex. Therefore an artificial neural network cannot die, feel pain or become depressed.
* '''Do artificial neural networks have disadvantages?'''
Yes, there are a few problems. A number of neural network topologies exist today, and each topology is suited to a particular kind of problem. Neural networks are slow to train, because training can take tens of thousands of iterations. And a neural network can overfit if it is overtrained.
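The train/compare/adjust loop described in the training answer above can be sketched in a few lines. This is an illustration only, not code from the ANN library (which is written in PHP and trains a multilayer perceptron): it trains a single threshold neuron on the logical OR function using the classic perceptron learning rule.

```python
# Illustrative sketch, NOT the ANN PHP library: the train/compare/adjust loop,
# shown for a single perceptron learning logical OR.
inputs  = [(0, 0), (0, 1), (1, 0), (1, 1)]
targets = [0, 1, 1, 1]          # desired outputs for logical OR

w = [0.0, 0.0]                  # weights
b = 0.0                         # bias (threshold)
rate = 0.1                      # learning rate

for epoch in range(100):        # repeat until outputs fit the desired outputs
    errors = 0
    for (x1, x2), target in zip(inputs, targets):
        output = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        error = target - output          # compare with the desired output
        if error != 0:                   # the network has to be adjusted
            w[0] += rate * error * x1
            w[1] += rate * error * x2
            b    += rate * error
            errors += 1
    if errors == 0:                      # all patterns learned
        break

print([1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for x1, x2 in inputs])
# -> [0, 1, 1, 1]
```

A full multilayer perceptron replaces the simple error rule with backpropagation, but the outer loop, comparing actual against desired outputs until they match, is the same idea.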
== Information about Neural Networks ==
English
* [http://en.wikipedia.org/wiki/Neural_Network Neural Network (en.wikipedia.org)]
* [http://en.wikipedia.org/wiki/Multilayer_perceptron Multilayer Perceptron (en.wikipedia.org)]
German
* [http://de.wikipedia.org/wiki/K%C3%BCnstliches_neuronales_Netz Künstliches Neuronales Netz (de.wikipedia.org)]
* [http://de.wikipedia.org/wiki/Perzeptron#Mehrlagiges_Perzeptron Mehrlagiges Perzeptron (de.wikipedia.org)]
65f3b76d30f6f91f1618727f42a6d67648e43246
Examples
0
7
33
2007-12-21T09:52:54Z
Thwien
2
New page: == Examples == * [[logical XOR function]]
wikitext
text/x-wiki
== Examples ==
* [[logical XOR function]]
f486e155bd174c91e8df16c54d3593d40824593c
38
33
2007-12-21T10:07:25Z
Thwien
2
wikitext
text/x-wiki
== Logical Functions ==
* [[logical XOR function]]
* [[logical OR function]]
* [[logical AND function]]
== Prediction ==
* [[Selling Icecreams]]
5d807d142be6de43ea4a19448ceb69f006ac4fff
44
38
2007-12-21T11:03:20Z
Thwien
2
/* Logical Functions */
wikitext
text/x-wiki
== Logical Functions ==
Training an artificial neural network on logical functions is interesting for learning how to use such a network, but has no practical use.
* [[logical XOR function]]
* [[logical OR function]]
* [[logical AND function]]
== Prediction ==
* [[Selling Icecreams]]
8e61da109b93a5bdabf843e62dadab079b3fb109
45
44
2007-12-21T11:04:23Z
Thwien
2
/* Prediction */
wikitext
text/x-wiki
== Logical Functions ==
Training an artificial neural network on logical functions is interesting for learning how to use such a network, but has no practical use.
* [[logical XOR function]]
* [[logical OR function]]
* [[logical AND function]]
== Prediction ==
One benefit of a multilayer perceptron is its ability to make predictions.
* [[Selling Icecreams]]
52d365259e8f8a0d054059946be254f30c953f33
48
45
2007-12-21T12:19:47Z
Thwien
2
/* Logical Functions */
wikitext
text/x-wiki
== Logical Functions ==
Training an artificial neural network on logical functions is interesting for learning how to use such a network, but has no practical use. The historically interesting point about the XOR function is that, during the early development of neural networks, it was shown that XOR cannot be learned by a single neuron; at the time, it was mathematically quite difficult to find a way to connect several neurons together.
* [[logical XOR function]]
* logical OR function
* logical AND function
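The claim above, that XOR cannot be learned by a single neuron, can be checked directly. The sketch below is illustrative Python, not part of the ANN library: it brute-forces a grid of weights for a single threshold neuron, finds a setting that reproduces AND, and finds none for XOR, because XOR is not linearly separable.

```python
# Sketch (not library code): no single linear threshold neuron reproduces XOR,
# while AND is easy to find by brute-force search over a weight grid.
def neuron_matches(table, w1, w2, b):
    return all((1 if w1 * x1 + w2 * x2 + b > 0 else 0) == y
               for (x1, x2), y in table)

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

grid = [i / 2 for i in range(-8, 9)]   # weights/bias from -4.0 to 4.0 in 0.5 steps

def solvable(table):
    return any(neuron_matches(table, w1, w2, b)
               for w1 in grid for w2 in grid for b in grid)

print(solvable(AND))   # True  -> one neuron suffices
print(solvable(XOR))   # False -> XOR needs a hidden layer
```

This is exactly why the XOR example in this wiki needs a multilayer perceptron rather than a single neuron.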
== Prediction ==
One benefit of a multilayer perceptron is its ability to make predictions.
* [[Selling Icecreams]]
59c66fb962a74519a2e837f1ecc3068b6c122e6a
Logical XOR function
0
8
34
2007-12-21T09:59:37Z
Thwien
2
New page: = Logical XOR function = == Training == <source lang="php"> require_once '../ANN/ANN_Network.php'; try { $network = ANN_Network::loadFromFile('xor.dat'); } catch(Exception $e) { pri...
wikitext
text/x-wiki
= Logical XOR function =
== Training ==
<source lang="php">
require_once '../ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print "\nCreating a new one...";
$network = new ANN_Network;
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$outputs = array(
array(0),
array(1),
array(1),
array(0)
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network->train();
$network->saveToFile('xor.dat');
</source>
== Using a trained network ==
<source lang="php">
require_once('../ANN/ANN_Network.php');
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print "\nNetwork not found.";
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$network->setInputs($inputs);
print_r($network->getOutputs());
</source>
c1f47f98b922deaea5bd6c30ed662034fa721735
35
34
2007-12-21T10:00:31Z
Thwien
2
wikitext
text/x-wiki
= Logical XOR function =
== Training ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print "\nCreating a new one...";
$network = new ANN_Network;
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$outputs = array(
array(0),
array(1),
array(1),
array(0)
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network->train();
$network->saveToFile('xor.dat');
</source>
== Using a trained network ==
<source lang="php">
require_once('ANN/ANN_Network.php');
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print "\nNetwork not found.";
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$network->setInputs($inputs);
print_r($network->getOutputs());
</source>
09598c626356ff73101108458a07158fcd73e8ce
37
35
2007-12-21T10:05:09Z
Thwien
2
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print "Creating a new one...";
$network = new ANN_Network;
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$outputs = array(
array(0),
array(1),
array(1),
array(0)
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network->train();
$network->saveToFile('xor.dat');
</source>
== Using a trained network ==
<source lang="php">
require_once('ANN/ANN_Network.php');
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print "Network not found.";
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$network->setInputs($inputs);
print_r($network->getOutputs());
</source>
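Why the network needs a hidden layer at all for XOR can be shown with hand-picked weights. The following sketch is not ANN library code and not weights learned by ANN_Network; it is a minimal 2-2-1 threshold network demonstrating that one hidden layer makes XOR representable.

```python
# Sketch with hand-picked weights (NOT learned by ANN_Network): a minimal
# 2-2-1 threshold network computing XOR as "OR but not AND".
def step(x):
    return 1 if x > 0 else 0

def xor_net(x1, x2):
    h1 = step(x1 + x2 - 0.5)        # hidden neuron 1 fires for OR(x1, x2)
    h2 = step(x1 + x2 - 1.5)        # hidden neuron 2 fires for AND(x1, x2)
    return step(h1 - h2 - 0.5)      # output: OR and not AND = XOR

print([xor_net(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# -> [0, 1, 1, 0]
```

Training with backpropagation, as ANN_Network does, finds weights that play the same role automatically instead of by hand.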
25f0e322407df89bbdab34351df77e792da85843
Selling Icecreams
0
9
47
2007-12-21T12:14:35Z
Thwien
2
New page: == Training == <source lang="php"> require_once 'ANN/ANN_Network.php'; try { $network = ANN_Network::loadFromFile('icecreams.dat'); } catch(Exception $e) { print "Creating a new one...
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print "Creating a new one...";
$network = new ANN_Network(2,8,1);
}
$temperature = new ANN_InputValue(-15, 50); // Temperature
$humidity = new ANN_InputValue(0, 100); // Humidity
$icecream = new ANN_OutputValue(0, 300); // Ice-Cream
$inputs = array(
array($temperature->GetInputValue(20), $humidity->GetInputValue(10)),
array($temperature->GetInputValue(30), $humidity->GetInputValue(40)),
array($temperature->GetInputValue(32), $humidity->GetInputValue(30)),
array($temperature->GetInputValue(33), $humidity->GetInputValue(20))
);
$outputs = array(
array($icecream->GetOutputValue(20)),
array($icecream->GetOutputValue(90)),
array($icecream->GetOutputValue(70)),
array($icecream->GetOutputValue(75))
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network->train();
$network->saveToFile('icecreams.dat');
</source>
477b62b3b540cfd3cd978eaa6c8ca2fce4e29916
50
47
2007-12-21T12:48:52Z
Thwien
2
/* Training */
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print "Creating a new one...";
$network = new ANN_Network(2,8,1);
}
$temperature = new ANN_InputValue(-15, 50); // Temperature
$humidity = new ANN_InputValue(0, 100); // Humidity
$icecream = new ANN_OutputValue(0, 300); // Ice-Cream
$inputs = array(
array($temperature->GetInputValue(20), $humidity->GetInputValue(10)),
array($temperature->GetInputValue(30), $humidity->GetInputValue(40)),
array($temperature->GetInputValue(32), $humidity->GetInputValue(30)),
array($temperature->GetInputValue(33), $humidity->GetInputValue(20))
);
$outputs = array(
array($icecream->GetOutputValue(20)),
array($icecream->GetOutputValue(90)),
array($icecream->GetOutputValue(70)),
array($icecream->GetOutputValue(75))
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network->train();
$network->saveToFile('icecreams.dat');
</source>
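ANN_InputValue and ANN_OutputValue in the listing above map raw values from a stated range (e.g. temperatures from -15 to 50) onto the network's working range and back. The exact PHP implementation may differ; the following Python sketch shows the min-max scaling such helpers presumably perform.

```python
# Sketch of the min-max scaling presumably done by ANN_InputValue /
# ANN_OutputValue (the actual PHP implementation may differ): map a raw
# value from [lo, hi] to [0, 1] for the network, and back for predictions.
def to_network(value, lo, hi):
    return (value - lo) / (hi - lo)

def from_network(value, lo, hi):
    return value * (hi - lo) + lo

print(to_network(20, -15, 50))     # temperature 20 on the [-15, 50] scale
print(from_network(0.25, 0, 300))  # a raw network output read on the [0, 300] ice-cream scale
```

Scaling matters because the transfer functions of a multilayer perceptron saturate outside a small numeric range, so raw quantities like "300 ice creams" must be compressed before training and expanded again afterwards.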
882ddda2234f211b63e180b2234fb6a604d8a475
Neural Networks
0
6
51
46
2007-12-21T13:26:46Z
Thwien
2
/* Questions and Answers */
wikitext
text/x-wiki
== Questions and Answers ==
* '''What is a neural network?'''
An artificial neural network is a mathematical model of an associative biological neural network.
* '''What are the differences between a biological and an artificial neural network?'''
There are huge differences between the two. The neurons of a biological neural network work massively in parallel, and such a network consists of billions or trillions of densely interconnected neurons. An artificial neural network, in comparison, has only a small number of neurons, e.g. 20, depending on the amount of data the network has to associate. Such an artificial network is not connected at random but in a controlled, planned layer structure with an input layer, one or more hidden layers, and an output layer (if a multilayer perceptron topology is used). And since an artificial neural network is computer software, it works sequentially, which is of course another big difference from a biological neural network.
* '''What is the benefit of an artificial neural network?'''
The big advantage of the human brain is its ability to match unknown things it perceives against a learned model stored in its neurons. This is called association. Technically it is a form of fuzzy information processing, so an artificial neural network can be used in environments where the input information is incomplete (as in face recognition).
* '''What are examples for technical use of artificial neural networks?'''
There are a few very interesting uses of artificial neural networks. For example, the '''local electricity utility in Duesseldorf, Germany''' uses a multilayer perceptron for the daily prediction of power consumption in the city, based on temperature, humidity, weekday, etc. Another example is predicting '''how many units of a product a supermarket will sell''' in one week; with this information, ordering and storage can be optimized. Such a network can also '''predict the number of daily calls in a call centre''' in order to plan how many employees have to work that day. The German Post uses neural networks for '''recognition of postal codes (PLZ)''' written on letters by machine or by hand; after recognition a computer-readable code is printed on the letter, so further recognition is not necessary. Artificial neural networks are also used to '''find numeric solutions of difficult mathematical functions''': the network can associate the output of a function with one or more unknown inputs, which is possible only by learning known inputs and outputs, giving the network an "idea" of the mathematical rule. '''General classification''' can also be done with a neural network; for example, such a network can tell from an image whether the person shown is looking left or right, laughing or not, male or female, and so on. Another example is a '''network router that learns the fastest routing''' of internet packets and can thereby optimize its routing decisions. Neural networks are also used to '''detect spam mails''' in mail clients or on mail servers. Further areas in which neural networks are used include: '''the stock market, credit assessment, air traffic control, robot control, game strategy control, noise tolerance of analogue modems, scheduling of buses, trams, air planes, and elevators, optimization of traffic flows, and weather forecasting.'''
* '''How to train an artificial neural network?'''
Much like a human brain: learning by doing, and learning, learning, learning. The artificial neural network is given inputs and produces an output. This output is compared with the desired output. If the two differ, the network is adjusted. This procedure is repeated until the output matches the desired output, after which the artificial network "knows" the rule for producing the right output. This is done for all known inputs and outputs. After training you can use the "knowledge" of the network to predict or recognize the output patterns belonging to known or unknown inputs.
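The compare-and-adjust loop described above can be sketched generically. The following is a minimal single-neuron illustration in Python (not the training algorithm of the ANN library itself, which uses multilayer backpropagation): the neuron learns the logical AND function by adjusting its weights whenever its output differs from the desired one.

```python
# Generic illustration of the train-compare-adjust loop: a single
# neuron with two inputs learns logical AND via the perceptron rule.
# This is NOT the ANN library's backpropagation implementation.

def train(samples, epochs=100, rate=0.5):
    w = [0.0, 0.0]   # one weight per input
    b = 0.0          # bias (threshold)
    for _ in range(epochs):
        for inputs, target in samples:
            # produce an output for the current inputs
            out = 1.0 if w[0]*inputs[0] + w[1]*inputs[1] + b > 0 else 0.0
            # compare with the desired output; adjust on a difference
            err = target - out
            w[0] += rate * err * inputs[0]
            w[1] += rate * err * inputs[1]
            b += rate * err
    return w, b

def predict(w, b, inputs):
    return 1.0 if w[0]*inputs[0] + w[1]*inputs[1] + b > 0 else 0.0

# Known inputs with their desired (aimed) outputs:
samples = [((0, 0), 0.0), ((0, 1), 0.0), ((1, 0), 0.0), ((1, 1), 1.0)]
w, b = train(samples)
```

After training, `predict` reproduces the desired outputs for all four known input patterns; a multilayer perceptron generalizes the same idea to problems that a single neuron cannot separate.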
* '''Can an artificial neural network forget?'''
Yes. Because it, like the human brain, is an associative processor, an artificial network can also make mistakes and can forget knowledge if certain inputs are trained only rarely. If you, as a human being, practice the piano only rarely, you cannot play it perfectly. The same holds for an artificial network.
* '''Does an artificial neural network have consciousness?'''
The one and only answer is: '''No'''. How consciousness arises in a biological neural network has not been figured out to this day, but for an artificial neural network it can safely be said that there is no consciousness, and it would not gain one even if the network were made enormously complex. An artificial neural network therefore cannot die, feel pain, or suffer from depression.
* '''Do artificial neural networks have disadvantages?'''
Yes, there are a few problems. A number of neural network topologies exist today, and each topology is suited to a particular kind of problem. Neural networks are slow to train, because training can require several tens of thousands of loops. And a neural network can overfit if it is overtrained.
* '''Which kinds of artificial neural networks exist?'''
There are several, for example: the multilayer perceptron, self-organizing maps (Kohonen nets), Hopfield nets, generic maps, and more.
== Information about Neural Networks ==
English
* [http://en.wikipedia.org/wiki/Neural_Network Neural Network (en.wikipedia.org)]
* [http://en.wikipedia.org/wiki/Multilayer_perceptron Multilayer Perceptron (en.wikipedia.org)]
German
* [http://de.wikipedia.org/wiki/K%C3%BCnstliches_neuronales_Netz Künstliches Neuronales Netz (de.wikipedia.org)]
* [http://de.wikipedia.org/wiki/Perzeptron#Mehrlagiges_Perzeptron Mehrlagiges Perzeptron (de.wikipedia.org)]
5d024825f45f30470919cae89402b85b35fbc8e1
96
95
2008-01-09T19:07:37Z
Thwien
2
wikitext
text/x-wiki
== Questions and Answers ==
* '''What is a neural network?'''
An artificial neural network is a mathematical model of an associative biological neural network.
* '''What are the differences between a biological and an artificial neural network?'''
There are huge differences between the two. The neurons of a biological neural network work massively in parallel, and such a network consists of billions or trillions of massively interconnected neurons. In comparison, an artificial neural network has only a small number of neurons, depending on the amount of data the network has to associate, e.g. 20 neurons. Such an artificial network is also not connected at random but in a controlled, planned layer structure with an input layer, one or a few hidden layers, and an output layer (if a multilayer perceptron topology is used). And since an artificial neural network is computer software, it works sequentially, which is of course a big difference from a biological neural network.
* '''What is the benefit of an artificial neural network?'''
The big advantage of the human brain is its ability to map unknown things it perceives onto a learned model stored in its neurons. This is called association. Technically it is called fuzzy information processing, and it means an artificial neural network can be used in environments where the input information is incomplete (such as face recognition).
* '''What are examples for technical use of artificial neural networks?'''
There are many interesting applications of artificial neural networks. For example, the '''local power utility in Duesseldorf, Germany''' uses a multilayer perceptron to predict the city's daily power consumption from temperature, humidity, weekday, and other factors. Another example is predicting '''how many units of a product a supermarket will sell''' in one week; with this information, ordering and storage can be optimized. Such a network can also be used to '''predict the number of daily calls in a call centre''', to plan how many employees have to work that day. The German Post uses neural networks for the '''recognition of postal codes (PLZ)''' written on letters by machine or by hand; after recognition, a machine-readable code is printed on the letter so that it does not have to be recognized again. Artificial neural networks are also used to '''find numeric solutions of difficult mathematical functions''': after learning known inputs and outputs, the network can associate an output with one or more unknown inputs, as if it had an "idea" of the underlying mathematical rule. '''General classification''' can also be done with a neural network; for example, such a network can detect from an image whether the person shown is looking left or right, laughing or not, male or female, and so on. Another example is a '''network router that learns the fastest routing''' of internet packets and can thereby optimize its routing decisions. Neural networks are also used to '''detect spam mails''' in mail clients and on mail servers. Further areas in which neural networks are used include: '''stock markets, credit assignment, air traffic control, robot control, game strategy control, noise tolerance of analogue modems, scheduling of buses, trams, air planes and elevators, optimization of traffic flows, and weather forecasting.'''
* '''How to train an artificial neural network?'''
The same way a human brain has to: learning by doing, over and over again. The artificial neural network is given inputs and produces an output. This output is compared with the aimed (target) output; if they differ, the network has to be changed. This procedure is repeated until the output matches the target, and it is carried out for all known inputs and outputs. After training, the artificial network "knows" the rules for producing the right output, and you can use this "knowledge" to predict or recognize the output patterns belonging to known or unknown inputs.
* '''Can an artificial neural network forget?'''
Yes. Just as the human brain is an associative processor, an artificial network can make mistakes and can forget knowledge if inputs are trained only rarely. If you as a human being rarely practise playing the piano, you cannot play it perfectly; the artificial network behaves similarly.
* '''Does an artificial neural network have consciousness?'''
The one and only answer can be: '''no'''. How consciousness arises in a biological neural network has not been figured out to this day, but for an artificial neural network it can safely be said that there is no consciousness, and it would not gain consciousness even if the network were made extremely complex. Therefore an artificial neural network cannot die, feel pain, or suffer from depression.
* '''Do artificial neural networks have disadvantages?'''
Yes, there are a few problems. A number of neural network topologies exist today, and each topology is suited to solving a special kind of problem. Neural networks are slow to train, because training can take several tens of thousands of loops. And a neural network can overfit if it is overtrained. Extrapolation of data is also a problem if the data is not presented properly to the neural network.
* '''Which kinds of artificial neural networks exist?'''
There are several, for example: the multilayer perceptron, self-organizing maps (Kohonen nets), Hopfield nets, generic maps, and more.
== Information about Neural Networks ==
English
* [http://en.wikipedia.org/wiki/Neural_Network Neural Network (en.wikipedia.org)]
* [http://en.wikipedia.org/wiki/Multilayer_perceptron Multilayer Perceptron (en.wikipedia.org)]
German
* [http://de.wikipedia.org/wiki/K%C3%BCnstliches_neuronales_Netz Künstliches Neuronales Netz (de.wikipedia.org)]
* [http://de.wikipedia.org/wiki/Perzeptron#Mehrlagiges_Perzeptron Mehrlagiges Perzeptron (de.wikipedia.org)]
bb7f57f36db707ac6c94c465202136a9bb8cc683
97
96
2008-01-09T19:08:40Z
Thwien
2
wikitext
text/x-wiki
== Questions and Answers ==
* '''What is a neural network?'''
An artificial neural network is a mathematical model of an associative biological neural network.
* '''What are the differences between a biological and an artificial neural network?'''
There are huge differences between the two. The neurons of a biological neural network work massively in parallel, and such a network consists of billions or trillions of massively interconnected neurons. In comparison, an artificial neural network has only a small number of neurons, depending on the amount of data the network has to associate, e.g. 20 neurons. Such an artificial network is also not connected at random but in a controlled, planned layer structure with an input layer, one or a few hidden layers, and an output layer (if a multilayer perceptron topology is used). And since an artificial neural network is computer software, it works sequentially, which is of course a big difference from a biological neural network.
* '''What is the benefit of an artificial neural network?'''
The big advantage of the human brain is its ability to map unknown things it perceives onto a learned model stored in its neurons. This is called association. Technically it is called fuzzy information processing, and it means an artificial neural network can be used in environments where the input information is incomplete (such as face recognition).
* '''What are examples for technical use of artificial neural networks?'''
There are many interesting applications of artificial neural networks. For example, the '''local power utility in Duesseldorf, Germany''' uses a multilayer perceptron to predict the city's daily power consumption from temperature, humidity, weekday, and other factors. Another example is predicting '''how many units of a product a supermarket will sell''' in one week; with this information, ordering and storage can be optimized. Such a network can also be used to '''predict the number of daily calls in a call centre''', to plan how many employees have to work that day. The German Post uses neural networks for the '''recognition of postal codes (PLZ)''' written on letters by machine or by hand; after recognition, a machine-readable code is printed on the letter so that it does not have to be recognized again. Artificial neural networks are also used to '''find numeric solutions of difficult mathematical functions''': after learning known inputs and outputs, the network can associate an output with one or more unknown inputs, as if it had an "idea" of the underlying mathematical rule. '''General classification''' can also be done with a neural network; for example, such a network can detect from an image whether the person shown is looking left or right, laughing or not, male or female, and so on. Another example is a '''network router that learns the fastest routing''' of internet packets and can thereby optimize its routing decisions. Neural networks are also used to '''detect spam mails''' in mail clients and on mail servers. Further areas in which neural networks are used include: '''stock markets, credit assignment, air traffic control, robot control, game strategy control, noise tolerance of analogue modems, scheduling of buses, trams, air planes and elevators, optimization of traffic flows, weather forecasting, and music composition.'''
* '''How to train an artificial neural network?'''
The same way a human brain has to: learning by doing, over and over again. The artificial neural network is given inputs and produces an output. This output is compared with the aimed (target) output; if they differ, the network has to be changed. This procedure is repeated until the output matches the target, and it is carried out for all known inputs and outputs. After training, the artificial network "knows" the rules for producing the right output, and you can use this "knowledge" to predict or recognize the output patterns belonging to known or unknown inputs.
* '''Can an artificial neural network forget?'''
Yes. Just as the human brain is an associative processor, an artificial network can make mistakes and can forget knowledge if inputs are trained only rarely. If you as a human being rarely practise playing the piano, you cannot play it perfectly; the artificial network behaves similarly.
* '''Does an artificial neural network have consciousness?'''
The one and only answer can be: '''no'''. How consciousness arises in a biological neural network has not been figured out to this day, but for an artificial neural network it can safely be said that there is no consciousness, and it would not gain consciousness even if the network were made extremely complex. Therefore an artificial neural network cannot die, feel pain, or suffer from depression.
* '''Do artificial neural networks have disadvantages?'''
Yes, there are a few problems. A number of neural network topologies exist today, and each topology is suited to solving a special kind of problem. Neural networks are slow to train, because training can take several tens of thousands of loops. And a neural network can overfit if it is overtrained. Extrapolation of data is also a problem if the data is not presented properly to the neural network.
* '''Which kinds of artificial neural networks exist?'''
There are several, for example: the multilayer perceptron, self-organizing maps (Kohonen nets), Hopfield nets, generic maps, and more.
== Information about Neural Networks ==
English
* [http://en.wikipedia.org/wiki/Neural_Network Neural Network (en.wikipedia.org)]
* [http://en.wikipedia.org/wiki/Multilayer_perceptron Multilayer Perceptron (en.wikipedia.org)]
German
* [http://de.wikipedia.org/wiki/K%C3%BCnstliches_neuronales_Netz Künstliches Neuronales Netz (de.wikipedia.org)]
* [http://de.wikipedia.org/wiki/Perzeptron#Mehrlagiges_Perzeptron Mehrlagiges Perzeptron (de.wikipedia.org)]
91c1d19f6b3ba6dbdf1915cb1673190d2deda84d
98
97
2008-01-10T10:24:12Z
Thwien
2
/* Questions and Answers */
wikitext
text/x-wiki
== Questions and Answers ==
* '''What is a neural network?'''
An artificial neural network is a mathematical model of an associative biological neural network.
* '''What are the differences between a biological and an artificial neural network?'''
There are huge differences between the two. The neurons of a biological neural network work massively in parallel, and such a network consists of billions or trillions of massively interconnected neurons. In comparison, an artificial neural network has only a small number of neurons, depending on the amount of data the network has to associate, e.g. 20 neurons. Such an artificial network is also not connected at random but in a controlled, planned layer structure with an input layer, one or a few hidden layers, and an output layer (if a multilayer perceptron topology is used). And since an artificial neural network is computer software, it works sequentially, which is of course a big difference from a biological neural network.
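The layer structure described above can be sketched in plain PHP. This is only an illustrative toy with arbitrary, untrained weights, not the API of this library: a multilayer perceptron with 2 inputs, one hidden layer of 2 neurons, and 1 output neuron.

```php
<?php
// Toy multilayer perceptron forward pass: 2 inputs -> 2 hidden -> 1 output.
// Weights are arbitrary illustration values, not a trained network.

function neuron(array $inputs, array $weights, $bias) {
    $sum = $bias;
    foreach ($inputs as $i => $x) {
        $sum += $x * $weights[$i];
    }
    return tanh($sum); // smooth transfer function, output in (-1, 1)
}

function feedForward(array $inputs) {
    // Hidden layer: every neuron sees every input (fully connected)
    $hidden = array(
        neuron($inputs, array(0.5, -0.4), 0.1),
        neuron($inputs, array(-0.3, 0.8), -0.2),
    );
    // Output layer: one neuron reading the hidden layer's outputs
    return neuron($hidden, array(1.0, -1.0), 0.0);
}

printf("network output: %+.4f\n", feedForward(array(0.6, 0.9)));
```

Because every layer's output passes through tanh, the final output always lies strictly between -1 and 1, which is why such a network needs its training data scaled into that range.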
* '''What is the benefit of an artificial neural network?'''
The big advantage of the human brain is its ability to map unknown things it perceives onto a learned model stored in its neurons. This is called association. Technically it is called fuzzy information processing, and it means an artificial neural network can be used in environments where the input information is incomplete (such as face recognition).
* '''What are examples for technical use of artificial neural networks?'''
** '''''Prediction of power consumption:''''' The local power utility in Duesseldorf, Germany uses a multilayer perceptron for daily prediction of the city's power consumption, based on temperature, humidity, weekday, and other factors.
** '''''Prediction of product sales:''''' Predicting how many units of a product a supermarket will sell in one week. With this information, ordering and storage can be optimized.
** '''''Prediction of incoming calls to a call centre:''''' Predicting the number of daily calls in a call centre to plan how many employees have to work that day.
** '''''Postal code recognition:''''' The German Post uses neural networks to recognize postal codes (PLZ) written on letters by machine or by hand. After recognition, a machine-readable code is printed on the letter so that it does not have to be recognized again.
** '''''Solving mathematical functions:''''' Artificial neural networks are also used to find numeric solutions of difficult mathematical functions. After learning known inputs and outputs, the network can associate an output with one or more unknown inputs, as if it had an "idea" of the underlying mathematical rule.
** '''''Classification:''''' General classification can also be done with a neural network. For example, such a network can detect from an image whether the person shown is looking left or right, laughing or not, male or female, and so on.
** '''''Intelligent network routing:''''' A network router that learns the fastest routing of internet packets could be built using neural networks, letting the router optimize its routing decisions.
** '''''Detecting spam mails:''''' Neural networks are also used to detect spam mails in mail clients and on mail servers.
** '''''Stock market:''''' Prediction of stock values from market rules and psychological reactions.
** '''''Credit assignment:''''' Determining the credit risk of bank customers by weighing different aspects.
** '''''Air traffic control'''''
** '''''Robot control'''''
** '''''Game strategy control'''''
** '''''Noise tolerance of analogue modems'''''
** '''''Scheduling buses, trams, air planes and elevators'''''
** '''''Optimization of traffic flows'''''
** '''''Weather forecast'''''
** '''''Music composition:''''' A neural network can learn rules of composition and create its own music based on them. (But to this day I have never heard neurally composed music.)
* '''How to train an artificial neural network?'''
The same way a human brain has to: learning by doing, over and over again. The artificial neural network is given inputs and produces an output. This output is compared with the aimed (target) output; if they differ, the network has to be changed. This procedure is repeated until the output matches the target, and it is carried out for all known inputs and outputs. After training, the artificial network "knows" the rules for producing the right output, and you can use this "knowledge" to predict or recognize the output patterns belonging to known or unknown inputs.
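The train-compare-adjust cycle described above can be shown with a minimal example. The sketch below is plain PHP, not this library's API: a single perceptron learns the logical AND function by repeatedly comparing its output with the aimed output and adjusting its weights on error.

```php
<?php
// Minimal illustration of supervised training: present inputs,
// compare the output with the aimed output, adjust on error,
// repeat until the network "knows" the rule (here: logical AND).
$weights = array(0.0, 0.0);
$bias    = 0.0;
$rate    = 0.1; // learning rate

// Known inputs and their aimed outputs (the AND truth table)
$samples = array(
    array(array(0, 0), 0),
    array(array(0, 1), 0),
    array(array(1, 0), 0),
    array(array(1, 1), 1),
);

for ($epoch = 0; $epoch < 100; $epoch++) {
    foreach ($samples as $sample) {
        list($inputs, $target) = $sample;
        // Produce an output using a simple threshold function
        $sum = $bias + $inputs[0] * $weights[0] + $inputs[1] * $weights[1];
        $output = ($sum > 0.0) ? 1 : 0;
        // Change the network only if output and aimed output differ
        $error = $target - $output;
        $weights[0] += $rate * $error * $inputs[0];
        $weights[1] += $rate * $error * $inputs[1];
        $bias       += $rate * $error;
    }
}

// After training, the perceptron reproduces the AND rule
$predictions = array();
foreach ($samples as $sample) {
    list($inputs, $target) = $sample;
    $sum = $bias + $inputs[0] * $weights[0] + $inputs[1] * $weights[1];
    $predictions[] = ($sum > 0.0) ? 1 : 0;
}
echo implode(' ', $predictions), "\n"; // prints: 0 0 0 1
```

A single perceptron suffices here because AND is linearly separable; problems like XOR are exactly why the hidden layers of a multilayer perceptron are needed.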
* '''Can an artificial neural network forget?'''
Yes. Just as the human brain is an associative processor, an artificial network can make mistakes and can forget knowledge if inputs are trained only rarely. If you as a human being rarely practise playing the piano, you cannot play it perfectly; the artificial network behaves similarly.
* '''Does an artificial neural network have consciousness?'''
The one and only answer can be: '''no'''. How consciousness arises in a biological neural network has not been figured out to this day, but for an artificial neural network it can safely be said that there is no consciousness, and it would not gain consciousness even if the network were made extremely complex. Therefore an artificial neural network cannot die, feel pain, or suffer from depression.
* '''Do artificial neural networks have disadvantages?'''
Yes, there are a few problems. A number of neural network topologies exist today, and each topology is suited to solving a special kind of problem. Neural networks are slow to train, because training can take several tens of thousands of loops. And a neural network can overfit if it is overtrained. Extrapolation of data is also a problem if the data is not presented properly to the neural network.
* '''Which kinds of artificial neural networks exist?'''
There are several, for example: the multilayer perceptron, self-organizing maps (Kohonen nets), Hopfield nets, generic maps, and more.
== Information about Neural Networks ==
English
* [http://en.wikipedia.org/wiki/Neural_Network Neural Network (en.wikipedia.org)]
* [http://en.wikipedia.org/wiki/Multilayer_perceptron Multilayer Perceptron (en.wikipedia.org)]
German
* [http://de.wikipedia.org/wiki/K%C3%BCnstliches_neuronales_Netz Künstliches Neuronales Netz (de.wikipedia.org)]
* [http://de.wikipedia.org/wiki/Perzeptron#Mehrlagiges_Perzeptron Mehrlagiges Perzeptron (de.wikipedia.org)]
46f1cc66e5121ee842df6375ce13e6d54c992bd4
Main Page
0
1
52
42
2007-12-21T13:46:28Z
Thwien
2
/* Todo */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called the ''multilayer perceptron'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002; several improvements and changes to this implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. To get a brief idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Versions and Change-Log ==
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.0 by Thomas Wien''' (17.12.2007) [[Download]]
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance improvements
* Avoiding array_keys() & srand()
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
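The change log above mentions both a threshold function and a tangens hyperbolicus (hyperbolic tangent) transfer function. As a rough sketch in plain PHP (not this library's API), the two behave like this:

```php
<?php
// Illustrative comparison of the two transfer (activation) functions
// mentioned in the change log. Not the ANN library's API.

// Hard threshold: fires 1 once the weighted sum exceeds 0
function threshold_activation($sum) {
    return ($sum > 0.0) ? 1.0 : 0.0;
}

// Hyperbolic tangent: smooth output in (-1, 1) and differentiable,
// which is what gradient-based training (backpropagation) requires
function tanh_activation($sum) {
    return tanh($sum);
}

foreach (array(-2.0, -0.5, 0.0, 0.5, 2.0) as $sum) {
    printf("sum=%+.1f  threshold=%.0f  tanh=%+.4f\n",
           $sum, threshold_activation($sum), tanh_activation($sum));
}
```

The threshold function only says "fires or not", while tanh also conveys how strongly, which is why smooth transfer functions are the usual choice for multilayer perceptrons.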
== Todo ==
* Separation of classes to several files
* Graphical output of neural network values
* Examples
* Version control by Subversion
* ANN_InputArray + ANN_OutputArray
* Wiki: More details to installation and use
* Wiki: Project specific logo
* PHPDoc: More details to documentation
== Overview ==
* [[Neural Networks]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
c36b703211273e14562c79731bf3e29950175bde
59
52
2007-12-21T21:53:06Z
Thwien
2
/* Todo */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called the ''multilayer perceptron'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002; several improvements and changes to this implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. To get a brief idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Versions and Change-Log ==
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.0 by Thomas Wien''' (17.12.2007) [[Download]]
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance improvements
* Avoiding array_keys() & srand()
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Todo ==
* Graphical output of neural network values
* Examples
* ANN_InputArray + ANN_OutputArray
* Wiki: More details to installation and use
* Wiki: Project specific logo
* PHPDoc: More details to documentation
== Overview ==
* [[Neural Networks]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
3295cb804bb9bd1c9adb857a62c66bbb4327a821
60
59
2007-12-21T21:57:27Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called the ''multilayer perceptron'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002; several improvements and changes to this implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. To get a brief idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Versions and Change-Log ==
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.0 by Thomas Wien''' (17.12.2007) [[Download]]
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance improvements
* Avoiding array_keys() & srand()
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 2.0.1 by Thomas Wien (Development)'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
== Todo ==
* Graphical output of neural network values
* Examples
* ANN_InputArray + ANN_OutputArray
* Wiki: More details to installation and use
* Wiki: Project specific logo
* PHPDoc: More details to documentation
== Overview ==
* [[Neural Networks]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
aa1c9fd529e3164c3eb38f86f5c4e2ddb3e16de3
61
60
2007-12-23T14:13:02Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called the ''multilayer perceptron'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002; several improvements and changes to this implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. To get a brief idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Versions and Change-Log ==
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.0 by Thomas Wien''' (17.12.2007) [[Download]]
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance improvements
* Avoiding array_keys() & srand()
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 2.0.1 by Thomas Wien (Development)'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network values
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Wiki: More details to installation and use
* Wiki: Project specific logo
* PHPDoc: More details to documentation
== Overview ==
* [[Neural Networks]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
74083ea8b22de37e9156886e47f76149a56dcfab
72
61
2007-12-31T13:05:10Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called the ''multilayer perceptron'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002; several improvements and changes to this implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. To get a brief idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Versions and Change-Log ==
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.0 by Thomas Wien''' (17.12.2007) [[Download]]
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance improvements
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 2.0.1 by Thomas Wien (Development)'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network values
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Wiki: More details to installation and use
* Wiki: Project specific logo
* PHPDoc: More details to documentation
== Overview ==
* [[Neural Networks]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
dc94d547d8eb78fa649862c781fc15ea67db47ce
73
72
2007-12-31T13:05:33Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called the ''multilayer perceptron'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002; several improvements and changes to this implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. To get a brief idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Versions and Change-Log ==
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.0 by Thomas Wien''' (17.12.2007) [[Download]]
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance improvements
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 2.0.1 by Thomas Wien (Development)'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Wiki: More details to installation and use
* Wiki: Project specific logo
* PHPDoc: More details to documentation
== Overview ==
* [[Neural Networks]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
2adaddab612fb5f217367c0661fc61743cb18c31
74
73
2007-12-31T13:06:36Z
Thwien
2
/* Todo */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called the ''multilayer perceptron'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002; several improvements and changes to this implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. To get a brief idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Versions and Change-Log ==
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.0 by Thomas Wien''' (17.12.2007) [[Download]]
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance improvements
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 2.0.1 by Thomas Wien (Development)'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Logging of network error changes for statistical usage
* Wiki: More details to installation and use
* Wiki: Project specific logo
* PHPDoc: More details to documentation
== Overview ==
* [[Neural Networks]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
2b662b46c08b6012e8a479c88a89f1fbd8a38f5d
78
74
2007-12-31T14:10:10Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called the ''multilayer perceptron'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002; several improvements and changes to this implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. To get a brief idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Versions and Change-Log ==
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.0 by Thomas Wien''' (17.12.2007) [[Download]]
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() and srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 2.0.1 by Thomas Wien (Development)'''
* Separation of classes to several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Logging of network error changes for statistical usage
* Wiki: More details on installation and usage
* Wiki: Project-specific logo
* PHPDoc: More details in the documentation
b0765291adc90cee36a549825f064687351d2d7b
80
78
2008-01-06T16:28:02Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''multilayer perceptron'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; several improvements and changes to that implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. For a short idea of what neural networks are good for, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Versions and Change-Log ==
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() and srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 2.0.1 by Thomas Wien''' (2008-01-06) [[Download]]
* Separation of classes to several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.2 by Thomas Wien (Development)'''
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Logging of network error changes for statistical usage
* Wiki: More details on installation and usage
* Wiki: Project-specific logo
* PHPDoc: More details in the documentation
0526b912a370bc19a6d5e02368c009f9988ac6fb
89
80
2008-01-06T20:51:12Z
Thwien
2
/* Todo */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''multilayer perceptron'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; several improvements and changes to that implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. For a short idea of what neural networks are good for, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Versions and Change-Log ==
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() and srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 2.0.1 by Thomas Wien''' (2008-01-06) [[Download]]
* Separation of classes to several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.2 by Thomas Wien (Development)'''
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Logging of network error changes for statistical usage
* Wiki: More details on installation and usage
* Wiki: Project-specific logo ( '''''Done!''''' )
* PHPDoc: More details in the documentation
d281a583ce8135c866f2432271b728c7c37f02b0
94
89
2008-01-07T22:07:12Z
Thwien
2
/* Todo */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''multilayer perceptron'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; several improvements and changes to that implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. For a short idea of what neural networks are good for, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Versions and Change-Log ==
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() and srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 2.0.1 by Thomas Wien''' (2008-01-06) [[Download]]
* Separation of classes to several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.2 by Thomas Wien (Development)'''
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Logging of network error changes for statistical usage
* Calculating total network error for csv logging
* Wiki: More details on installation and usage
* Wiki: Project-specific logo ( '''''Done!''''' )
* PHPDoc: More details in the documentation
ffa72a37e4266d8f5cbd3e522ef47a4f3b2c704a
99
94
2008-01-11T12:49:41Z
Thwien
2
/* Todo */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''multilayer perceptron'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; several improvements and changes to that implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. For a short idea of what neural networks are good for, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Versions and Change-Log ==
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() and srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 2.0.1 by Thomas Wien''' (2008-01-06) [[Download]]
* Separation of classes to several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.2 by Thomas Wien (Development)'''
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Logging of network error changes for statistical usage
* Calculating total network error for csv logging
* Performance check depending on host system
* Wiki: More details on installation and usage
* Wiki: Project-specific logo ( '''''Done!''''' )
* PHPDoc: More details in the documentation
efc7873df5158ed576f863fea940b91360fe599d
100
99
2008-01-13T11:40:12Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; several improvements and changes to that implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. For a short idea of what neural networks are good for, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Versions and Change-Log ==
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() and srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 2.0.1 by Thomas Wien''' (2008-01-06) [[Download]]
* Separation of classes to several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.2 by Thomas Wien (Development)'''
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Logging of network error changes for statistical usage
* Calculating total network error for csv logging
* Performance check depending on host system
* Wiki: More details on installation and usage
* Wiki: Project-specific logo ( '''''Done!''''' )
* PHPDoc: More details in the documentation
6db217d583bf1058f72f91c12c79ef711073dee5
Logical XOR function
0
8
53
37
2007-12-21T13:50:19Z
Thwien
2
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$network = new ANN_Network;
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$outputs = array(
array(0),
array(1),
array(1),
array(0)
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network->train();
$network->saveToFile('xor.dat');
</source>
== Using a trained network ==
<source lang="php">
require_once('ANN/ANN_Network.php');
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
die('Network not found.');
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$network->setInputs($inputs);
print_r($network->getOutputs());
</source>
21060ce1b70e735b47811a3aa9d9750e6a169c7e
54
53
2007-12-21T13:50:52Z
Thwien
2
/* Using a trained network */
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$network = new ANN_Network;
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$outputs = array(
array(0),
array(1),
array(1),
array(0)
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network->train();
$network->saveToFile('xor.dat');
</source>
== Using the trained network ==
<source lang="php">
require_once('ANN/ANN_Network.php');
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
die('Network not found.');
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$network->setInputs($inputs);
print_r($network->getOutputs());
</source>
b82be37d0029987ef6d8ca838d4c423a75cb088b
56
54
2007-12-21T14:16:52Z
Thwien
2
/* Using the trained network */
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$network = new ANN_Network;
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$outputs = array(
array(0),
array(1),
array(1),
array(0)
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network->train();
$network->saveToFile('xor.dat');
</source>
== Using the trained network ==
<source lang="php">
require_once('ANN/ANN_Network.php');
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
die('Network not found');
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$network->setInputs($inputs);
print_r($network->getOutputs());
</source>
21da8da15c7e0c17cd62cef323b5c685727f3918
57
56
2007-12-21T14:17:16Z
Thwien
2
/* Using the trained network */
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$network = new ANN_Network;
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$outputs = array(
array(0),
array(1),
array(1),
array(0)
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network->train();
$network->saveToFile('xor.dat');
</source>
== Using trained network ==
<source lang="php">
require_once('ANN/ANN_Network.php');
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
die('Network not found');
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$network->setInputs($inputs);
print_r($network->getOutputs());
</source>
3731323bd7db1494550232a0f903f0a25e734a89
77
57
2007-12-31T13:43:31Z
Thwien
2
/* Using trained network */
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$network = new ANN_Network;
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$outputs = array(
array(0),
array(1),
array(1),
array(0)
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network->train();
$network->saveToFile('xor.dat');
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
die('Network not found');
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$network->setInputs($inputs);
print_r($network->getOutputs());
</source>
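The values returned by getOutputs() are floating-point activations close to 0 or 1 rather than exact binary values. The following plain-PHP sketch (independent of the ANN classes; the 0.5 threshold is an assumption, not part of the library) shows how such activations can be mapped back to binary XOR results:

```php
<?php
// Map raw network activations (floats between 0 and 1) to binary values.
// The 0.5 threshold is an assumption and not part of the ANN library.
function toBinary(array $activations, $threshold = 0.5)
{
    $binary = array();
    foreach($activations as $value)
        $binary[] = ($value >= $threshold) ? 1 : 0;
    return $binary;
}

print_r(toBinary(array(0.03, 0.97, 0.94, 0.08))); // binary XOR results 0, 1, 1, 0
?>
```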
c45b8bd99112c3b65e5b7398e05c1093e2146743
Selling Icecreams
0
9
55
50
2007-12-21T13:54:33Z
Thwien
2
/* Training */
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$network = new ANN_Network(2,8,1);
}
$temperature = new ANN_InputValue(-15, 50); // Temperature
$humidity = new ANN_InputValue(0, 100); // Humidity
$icecream = new ANN_OutputValue(0, 300); // Ice-Cream
$inputs = array(
array($temperature->getInputValue(20), $humidity->getInputValue(10)),
array($temperature->getInputValue(30), $humidity->getInputValue(40)),
array($temperature->getInputValue(32), $humidity->getInputValue(30)),
array($temperature->getInputValue(33), $humidity->getInputValue(20))
);
$outputs = array(
array($icecream->getOutputValue(20)),
array($icecream->getOutputValue(90)),
array($icecream->getOutputValue(70)),
array($icecream->getOutputValue(75))
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network->train();
$network->saveToFile('icecreams.dat');
</source>
cacd694a4bfef02fb2ebb97749d44b7d7f10316b
62
55
2007-12-31T09:24:44Z
Thwien
2
/* Training */
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$network = new ANN_Network(2,8,1);
}
$temperature = new ANN_InputValue(-15, 50); // Temperature
$humidity = new ANN_InputValue(0, 100); // Humidity
$icecream = new ANN_OutputValue(0, 300); // Ice-Creams
$inputs = array(
array($temperature->getInputValue(20), $humidity->getInputValue(10)),
array($temperature->getInputValue(30), $humidity->getInputValue(40)),
array($temperature->getInputValue(32), $humidity->getInputValue(30)),
array($temperature->getInputValue(33), $humidity->getInputValue(20))
);
$outputs = array(
array($icecream->getOutputValue(20)),
array($icecream->getOutputValue(90)),
array($icecream->getOutputValue(70)),
array($icecream->getOutputValue(75))
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network->train();
$network->saveToFile('icecreams.dat');
</source>
eb1762f29991c4b9ef25aad4083b18864d8e4ffb
68
62
2007-12-31T12:25:03Z
Thwien
2
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$network = new ANN_Network(2,8,1);
$temperature = new ANN_InputValue(-15, 50); // Temperature in Celsius
$temperature->saveToFile('input_temperature.dat');
unset($temperature);
$humidity = new ANN_InputValue(0, 100); // Humidity percentage
$humidity->saveToFile('input_humidity.dat');
unset($humidity);
$icecream = new ANN_OutputValue(0, 300); // Quantity of sold ice-creams
$icecream->saveToFile('output_quantity.dat');
unset($icecream);
}
try
{
$temperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$humidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$icecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
$inputs = array(
array($temperature->getInputValue(20), $humidity->getInputValue(10)),
array($temperature->getInputValue(30), $humidity->getInputValue(40)),
array($temperature->getInputValue(32), $humidity->getInputValue(30)),
array($temperature->getInputValue(33), $humidity->getInputValue(20))
);
$outputs = array(
array($icecream->getOutputValue(20)),
array($icecream->getOutputValue(90)),
array($icecream->getOutputValue(70)),
array($icecream->getOutputValue(75))
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network->train();
$network->saveToFile('icecreams.dat');
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
die('Network not found.');
}
try
{
$temperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$humidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$icecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
$inputs = array(
array($temperature->getInputValue(20), $humidity->getInputValue(10)),
array($temperature->getInputValue(30), $humidity->getInputValue(40)),
array($temperature->getInputValue(32), $humidity->getInputValue(30)),
array($temperature->getInputValue(33), $humidity->getInputValue(20))
);
$network->setInputs($inputs);
$outputs = $network->getOutputs();
foreach($outputs as $output)
print $icecream->getRealOutputValue($output). '<br />';
</source>
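ANN_InputValue and ANN_OutputValue scale real-world values into the working range of the network and back again. The following plain-PHP sketch illustrates the min-max scaling idea behind these classes (the library's exact internal formula is an assumption here):

```php
<?php
// Min-max scaling: map a real-world value from [min, max] into [0, 1],
// and map a network output in [0, 1] back to the real-world range.
// This only illustrates the idea behind ANN_InputValue / ANN_OutputValue;
// the library's actual internals may differ.
function scaleInput($value, $min, $max)
{
    return ($value - $min) / ($max - $min);
}

function scaleOutput($normalized, $min, $max)
{
    return $min + $normalized * ($max - $min);
}

print scaleInput(20, -15, 50);  // temperature of 20 degrees Celsius, scaled
print "\n";
print scaleOutput(0.3, 0, 300); // network output 0.3 means 90 sold ice-creams
?>
```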
f5022fb86e21e1088d6a4d2e27b5b4fa821356ef
Download
0
2
63
43
2007-12-31T09:31:52Z
Thwien
2
/* Version '''2.0.0''' (2007-12-17) '''''stable''''' */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
The current stable version of the ANN implementation for PHP 5.x is version 2.0.0, released on 17 December 2007. See the [[Installation]] section for information on how to integrate these PHP libraries into your project.
== Version '''2.0.0''' (2007-12-17) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip]
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz]
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() and srand()
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
c6093d1ead731a7f01bb948e71b6c06abf69adbd
65
63
2007-12-31T10:59:12Z
Thwien
2
/* Version '''2.0.0''' (2007-12-17) '''''stable''''' */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
The current stable version of the ANN implementation for PHP 5.x is version 2.0.0, released on 17 December 2007. See the [[Installation]] section for information on how to integrate these PHP libraries into your project.
== Version '''2.0.0''' (2007-12-17) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() and srand()
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
e901d4428ba2de815b677026fe2bb5c620c8e7dd
69
65
2007-12-31T12:36:58Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
The current stable version of the ANN implementation for PHP 5.x is version 2.0.0, released on 17 December 2007. See the [[Installation]] section for information on how to integrate these PHP libraries into your project.
== Requirements ==
The latest implementation requires a PHP environment running PHP 5.2.x or above.
== Version '''2.0.0''' (2007-12-17) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() and srand()
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
9b9a2fe31bb911fef7c235fff9ca266d9d6e165a
70
69
2007-12-31T13:01:34Z
Thwien
2
/* Requirements */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
The current stable version of the ANN implementation for PHP 5.x is version 2.0.0, released on 17 December 2007. See the [[Installation]] section for information on how to integrate these PHP libraries into your project.
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or above.
== Version '''2.0.0''' (2007-12-17) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() and srand()
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
49c31f5ed62c1367916fc7503d1191ff7d283110
71
70
2007-12-31T13:04:57Z
Thwien
2
/* Version '''2.0.0''' (2007-12-17) '''''stable''''' */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
The current stable version of the ANN implementation for PHP 5.x is version 2.0.0, released on 17 December 2007. See the [[Installation]] section for information on how to integrate these PHP libraries into your project.
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or above.
== Version '''2.0.0''' (2007-12-17) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() and srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
1971e8b0fc425284636293afa55d79d52a776bee
79
71
2008-01-06T16:24:33Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
The current stable version of the ANN implementation for PHP 5.x is '''version 2.0.1'''. See the [[Installation]] section for information on how to integrate these PHP libraries into your project.
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or above.
== Version '''2.0.1''' (2008-01-06) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() and srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
c46e3a0ce01f01aa98ba10239adf90bf3560ad51
Examples
0
7
75
48
2007-12-31T13:10:53Z
Thwien
2
wikitext
text/x-wiki
== Logical Functions ==
Training an artificial neural network to learn logical functions is interesting mainly as an exercise in using such a network, not for practical applications. The historically interesting point about the XOR function is that, during the early development of neural networks, it was discovered that XOR cannot be learned by a single neuron, and at the time it was mathematically quite difficult to find a way to connect several neurons together.
* [[logical XOR function]]
* logical OR function
* logical AND function
== Prediction ==
One benefit of the multilayer perceptron is its ability to make predictions.
* [[Selling Icecreams]]
* Daily power consumption
== Optimization ==
* Internet routing decision
513856665c5437fc1c2a93498e205d754e810cfe
76
75
2007-12-31T13:11:27Z
Thwien
2
wikitext
text/x-wiki
== Logical Functions ==
Training an artificial neural network to learn logical functions is interesting mainly as an exercise in using such a network, not for practical applications. The historically interesting point about the XOR function is that, during the early development of neural networks, it was discovered that XOR cannot be learned by a single neuron, and at the time it was mathematically quite difficult to find a way to connect several neurons together.
* [[logical XOR function]]
* logical OR function
* logical AND function
== Prediction ==
One benefit of the multilayer perceptron is its ability to make predictions.
* [[Selling Icecreams]]
* [[Daily power consumption]]
== Optimization ==
* [[Internet routing decision]]
6f66ae8c5086aca187390087d54b0e8f0873acd5
82
76
2008-01-06T16:38:39Z
Thwien
2
wikitext
text/x-wiki
== Logical Functions ==
Training an artificial neural network to learn logical functions is interesting mainly as an exercise in using such a network, not for practical applications. The historically interesting point about the XOR function is that, during the early development of neural networks, it was discovered that XOR cannot be learned by a single neuron, and at the time it was mathematically quite difficult to find a way to connect several neurons together.
* [[logical XOR function]]
* logical OR function
* logical AND function
== Prediction ==
One benefit of multilayer perceptron is the possibility of prediction.
* [[Selling Icecreams]]
* [[Daily power consumption]]
== Optimization ==
* [[Internet routing decision]]
== Several functions of ANN ==
* [[Visual network topoloy]]
* [[Logging network weights]]
c5c7cdd56aa56e4b1340b03d30a792faac07c1ea
92
82
2008-01-06T21:56:36Z
Thwien
2
/* Prediction */
wikitext
text/x-wiki
== Logical Functions ==
Training an artificial neural network to learn logical functions is interesting for learning how to use such a network, but it has no practical use in itself. The XOR function is interesting mainly for historical reasons: during the development of neural networks it was discovered that XOR cannot be learned by a single neuron, and for a long time it was mathematically difficult to find a way of connecting several neurons together.
* [[logical XOR function]]
* logical OR function
* logical AND function
== Prediction ==
One benefit of the multilayer perceptron is its ability to make predictions.
* [[Selling Icecreams]]
* [[Daily power consumption]]
* [[Lottery]]
== Optimization ==
* [[Internet routing decision]]
== Further functions of ANN ==
* [[Visual network topology]]
* [[Logging network weights]]
Installation
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Installation ==
* [[Download]] the source code
* Unpack the source code
>tar -xzf ann201.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source code:
<source lang="php">
<?php
require_once ('ANN/ANN_Network.php');
$ann = new ANN_Network;
?>
</source>
* Learn to use the library; have a look at the [[Examples]] chapter.
Visual network topology
== PNG image of network topology ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Network not found';
exit;
}
$image = new ANN_NetworkGraph($network);
$image->saveToFile('network.png');
</source>
== Output ==
[[Image:network.png|800px|Image of network topology]]
File:Network.png
Output image of class ANN_NetworkGraph
Logging network weights
== Logging network weights while training ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$network = new ANN_Network;
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$outputs = array(
array(0),
array(1),
array(1),
array(0)
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network->logToFile('network.csv'); // Start logging
$network->train();
$network->saveToFile('xor.dat');
</source>
Lottery
Sorry if you are disappointed: a '''prediction of next weekend's lottery numbers is not possible''' with neural networks. A lottery has no usable inputs and outputs to train on, because it is entirely random. It would make no sense to feed all the numbers drawn in the past into the neural network, because there is no relation between the possible numbers and the drawn ones. And if I had ever figured out such a prediction solution, I would never publish it here! ;-)
Multilayer perceptron
== General ==
A multilayer perceptron is a feedforward artificial neural network: the signal flows from the input layer through the hidden layers to the output layer. During training, the error correction of the neural weights is done in the opposite direction. This is done by the backpropagation algorithm.
== Activation ==
First, a cumulative input is calculated by the following equation:
<math>s = \sum^{n}_{k=1} i_{k} \cdot w_{k}</math>
=== Sigmoid activation function ===
<math>o = \rm{sig}(s) = \frac{1}{1 + \rm e^{-s}}</math>
=== Hyperbolic tangent activation function ===
<math>o = \tanh(s)</math>
using output range between -1 and 1, or
<math>o = \frac{\tanh(s) + 1}{2}</math>
using output range between 0 and 1.
:<math>s</math> cumulative input
:<math>w</math> weight of input
:<math>i</math> value of input
:<math>n</math> number of inputs
:<math>k</math> index of the neuron
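As an illustration, the activation functions above can be sketched in plain PHP. This is a minimal standalone sketch, not code from the ANN library itself:

<source lang="php">
// Cumulative input: s = sum of i_k * w_k over all inputs
function cumulativeInput($inputs, $weights)
{
  $s = 0;
  foreach($inputs as $k => $input)
    $s += $input * $weights[$k];
  return $s;
}

// Sigmoid activation: output between 0 and 1
function sigmoid($s)
{
  return 1 / (1 + exp(-$s));
}

// Hyperbolic tangent activation, rescaled to the range 0 to 1
function tanhScaled($s)
{
  return (tanh($s) + 1) / 2;
}
</source>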
== Error of neural network ==
If the neural network is initialized with random weights, it will of course not produce the expected output, so training is necessary. During supervised training, known inputs and their corresponding output values are presented to the network, which makes it possible to compare the actual output with the desired output. The error is defined by the following equation:
<math>E={1\over2} \sum^{n}_{i=1} (t_{i}-o_{i})^{2}</math>
:<math>E</math> network error
:<math>n</math> count of input patterns
:<math>t_{i}</math> desired output
:<math>o_{i}</math> calculated output
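This error measure can be computed in PHP as follows (a sketch using the notation above, not library code):

<source lang="php">
// Network error: E = 1/2 * sum of (t_i - o_i)^2 over all patterns
function networkError($desiredOutputs, $calculatedOutputs)
{
  $error = 0;
  foreach($desiredOutputs as $i => $desired)
    $error += pow($desired - $calculatedOutputs[$i], 2);
  return $error / 2;
}
</source>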
== Backpropagation ==
The learning algorithm of a single-layer perceptron is simple compared to that of a multilayer perceptron. The reason is that only the output layer is directly connected to the output, while the hidden layers are not; therefore the calculation of the correct weights for the hidden layers is mathematically difficult. The delta value for changing the weights of a hidden neuron is given by the following equation:
<math>\Delta w_{ij}= -\alpha \cdot {\partial E \over \partial w_{ij}} = \alpha \cdot \delta_{j} \cdot x_{i}</math>
:<math>E</math> network error
:<math>\Delta w_{ij}</math> delta value <math>w_{ij}</math> of neuron connection <math>i</math> to <math>j</math>
:<math>\alpha</math> learning rate
:<math>\delta_{j}</math> the error of neuron <math>j</math>
:<math>x_{i}</math> input of neuron <math>i</math>
:<math>t_{j}</math> desired output of output neuron <math>j</math>
:<math>o_{j}</math> real output of output neuron <math>j</math>.
== Programming solution of backpropagation ==
In this PHP implementation of the multilayer perceptron, the following algorithm is used for the weight changes in the hidden layers:
<math>s_{kl} = \sum^{n}_{l=1} w_{k} \cdot \Delta w_{l} \cdot \beta</math>
<math>\Delta w_{k} = o_{k} \cdot (1 - o_{k}) \cdot s_{kl}</math>
<math>w_{mk} = w_{mk} + \alpha \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>\alpha</math> learning rate
:<math>\beta</math> momentum
:<math>k</math> neuron k
:<math>l</math> neuron l
:<math>m</math> weight m
:<math>i</math> input
:<math>o</math> output
:<math>n</math> count of neurons
To avoid overfitting of the neural network, the training procedure is finished once the actual output value lies within a fault tolerance of 1 per cent of the desired output value.
== Choosing learning rate and momentum ==
Choosing proper values for the learning rate (<math>\alpha</math>) and the momentum (<math>\beta</math>) is a matter of experience. Both values range between 0 and 1. This PHP implementation uses a default value of 0.5 for <math>\alpha</math> and 0.95 for <math>\beta</math>. These factors can be changed at runtime.
== Binary and linear input ==
If binary input is used, the input value is simply 0 for ''false'' and 1 for ''true''. When using linear input values, normalization is needed:
<math>i = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>i</math> input value for neural network
:<math>f</math> real world value
== Binary and linear output ==
The interpretation of output values only makes sense for the output layer and depends on the use of the neural network. If the network is used for classification, binary output is used, which has two states: true or false. The network always produces linear output values, so these values have to be converted to binary values:
<math>o >= 0.5 : True</math>
<math>o < 0.5 : False</math>
:<math>o</math> output value
When using linear output, the output values have to be converted back to the real-world range the network is trained for:
<math>f = o \cdot (f_{max} - f_{min}) + f_{min}</math>
:<math>f</math> real world value
:<math>o</math> real output value of neural network
The same normalization equation for input values is used for output values while training the network.
<math>o = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>o</math> desired output value for neural network
:<math>f</math> real world value
6a4acf01c9167247d469c100aa9d7f56eee09604
112
111
2008-01-13T15:00:25Z
Thwien
2
wikitext
text/x-wiki
== General ==
A multilayer perceptron is a feedforward artificial neural network: the signal flows from the input layer through the hidden layers to the output layer. During training, the error correction of the weights is done in the opposite direction, by the backpropagation algorithm.
== Activation ==
At first a cumulative input is calculated by the following equation:
<math>s = \sum^{n}_{k=1} i_{k} \cdot w_{k}</math>
=== Sigmoid activation function ===
<math>o = \rm{sig}(s) = \frac{1}{1 + \rm e^{-s}}</math>
=== Hyperbolic tangent activation function ===
<math>o = tanh(s)</math>
using output range between -1 and 1, or
<math>o = \frac{tanh(s) + 1}{2}</math>
using output range between 0 and 1.
:<math>s</math> cumulative input
:<math>w</math> weight of input
:<math>i</math> value of input
:<math>n</math> number of inputs
:<math>k</math> index of input
== Error of neural network ==
If the neural network is initialized with random weights, it will of course not produce the expected output, so training is necessary. During supervised training, known inputs and their corresponding output values are presented to the network, making it possible to compare the actual output with the desired output. The error is defined by the following equation:
<math>E={1\over2} \sum^{n}_{i=1} (t_{i}-o_{i})^{2}</math>
:<math>E</math> network error
:<math>n</math> count of input patterns
:<math>t_{i}</math> desired output
:<math>o_{i}</math> calculated output
== Backpropagation ==
The learning algorithm of a single-layer perceptron is simple compared to that of a multilayer perceptron, because only the output layer is directly connected to the output, while the hidden layers are not. Therefore calculating the correct weights of the hidden layers is mathematically difficult. The delta value for changing the weights of a hidden neuron is given by the following equation:
<math>\Delta w_{ij}= -\alpha \cdot {\partial E \over \partial w_{ij}} = \alpha \cdot \delta_{j} \cdot x_{i}</math>
:<math>E</math> network error
:<math>\Delta w_{ij}</math> delta value <math>w_{ij}</math> of neuron connection <math>i</math> to <math>j</math>
:<math>\alpha</math> learning rate
:<math>\delta_{j}</math> the error of neuron <math>j</math>
:<math>x_{i}</math> input of neuron <math>i</math>
:<math>t_{j}</math> desired output of output neuron <math>j</math>
:<math>o_{j}</math> real output of output neuron <math>j</math>.
== Programming solution of backpropagation ==
In this PHP implementation of the multilayer perceptron, the following algorithm is used for weight changes in the hidden layers:
<math>s_{kl} = \sum^{n}_{l=1} w_{k} \cdot \Delta w_{l} \cdot \beta</math>
<math>\Delta w_{k} = o_{k} \cdot (1 - o_{k}) \cdot s_{kl}</math>
<math>w_{mk} = w_{mk} + \alpha \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>\alpha</math> learning rate
:<math>\beta</math> momentum
:<math>k</math> neuron k
:<math>l</math> neuron l
:<math>m</math> weight m
:<math>i</math> input
:<math>o</math> output
:<math>n</math> count of neurons
To avoid overfitting of the neural network, the training procedure is finished once the actual output value lies within a fault tolerance of 1 per cent of the desired output value.
== Choosing learning rate and momentum ==
Choosing proper values for the learning rate (<math>\alpha</math>) and the momentum (<math>\beta</math>) is a matter of experience. Both values range between 0 and 1. This PHP implementation uses a default value of 0.5 for <math>\alpha</math> and 0.95 for <math>\beta</math>. These factors can be changed at runtime.
== Binary and linear input ==
If binary input is used, the input value is simply 0 for ''false'' and 1 for ''true''. When using linear input values, normalization is needed:
<math>i = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>i</math> input value for neural network
:<math>f</math> real world value
This PHP implementation supports input normalization.
== Binary and linear output ==
The interpretation of output values only makes sense for the output layer and depends on the use of the neural network. If the network is used for classification, binary output is used, which has two states: true or false. The network always produces linear output values, so these values have to be converted to binary values:
<math>o >= 0.5 : True</math>
<math>o < 0.5 : False</math>
:<math>o</math> output value
When using linear output, the output values have to be converted back to the real-world range the network is trained for:
<math>f = o \cdot (f_{max} - f_{min}) + f_{min}</math>
:<math>f</math> real world value
:<math>o</math> real output value of neural network
The same normalization equation for input values is used for output values while training the network.
<math>o = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>o</math> desired output value for neural network
:<math>f</math> real world value
This PHP implementation supports output normalization.
40ae198ea8e866cfdcf83f42ec186a04c62d26bf
114
112
2008-01-13T16:40:44Z
Thwien
2
/* Binary and linear input */
wikitext
text/x-wiki
== General ==
A multilayer perceptron is a feedforward artificial neural network: the signal flows from the input layer through the hidden layers to the output layer. During training, the error correction of the weights is done in the opposite direction, by the backpropagation algorithm.
== Activation ==
At first a cumulative input is calculated by the following equation:
<math>s = \sum^{n}_{k=1} i_{k} \cdot w_{k}</math>
=== Sigmoid activation function ===
<math>o = \rm{sig}(s) = \frac{1}{1 + \rm e^{-s}}</math>
=== Hyperbolic tangent activation function ===
<math>o = tanh(s)</math>
using output range between -1 and 1, or
<math>o = \frac{tanh(s) + 1}{2}</math>
using output range between 0 and 1.
:<math>s</math> cumulative input
:<math>w</math> weight of input
:<math>i</math> value of input
:<math>n</math> number of inputs
:<math>k</math> index of input
== Error of neural network ==
If the neural network is initialized with random weights, it will of course not produce the expected output, so training is necessary. During supervised training, known inputs and their corresponding output values are presented to the network, making it possible to compare the actual output with the desired output. The error is defined by the following equation:
<math>E={1\over2} \sum^{n}_{i=1} (t_{i}-o_{i})^{2}</math>
:<math>E</math> network error
:<math>n</math> count of input patterns
:<math>t_{i}</math> desired output
:<math>o_{i}</math> calculated output
== Backpropagation ==
The learning algorithm of a single-layer perceptron is simple compared to that of a multilayer perceptron, because only the output layer is directly connected to the output, while the hidden layers are not. Therefore calculating the correct weights of the hidden layers is mathematically difficult. The delta value for changing the weights of a hidden neuron is given by the following equation:
<math>\Delta w_{ij}= -\alpha \cdot {\partial E \over \partial w_{ij}} = \alpha \cdot \delta_{j} \cdot x_{i}</math>
:<math>E</math> network error
:<math>\Delta w_{ij}</math> delta value <math>w_{ij}</math> of neuron connection <math>i</math> to <math>j</math>
:<math>\alpha</math> learning rate
:<math>\delta_{j}</math> the error of neuron <math>j</math>
:<math>x_{i}</math> input of neuron <math>i</math>
:<math>t_{j}</math> desired output of output neuron <math>j</math>
:<math>o_{j}</math> real output of output neuron <math>j</math>.
== Programming solution of backpropagation ==
In this PHP implementation of the multilayer perceptron, the following algorithm is used for weight changes in the hidden layers:
<math>s_{kl} = \sum^{n}_{l=1} w_{k} \cdot \Delta w_{l} \cdot \beta</math>
<math>\Delta w_{k} = o_{k} \cdot (1 - o_{k}) \cdot s_{kl}</math>
<math>w_{mk} = w_{mk} + \alpha \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>\alpha</math> learning rate
:<math>\beta</math> momentum
:<math>k</math> neuron k
:<math>l</math> neuron l
:<math>m</math> weight m
:<math>i</math> input
:<math>o</math> output
:<math>n</math> count of neurons
To avoid overfitting of the neural network, the training procedure is finished once the actual output value lies within a fault tolerance of 1 per cent of the desired output value.
== Choosing learning rate and momentum ==
Choosing proper values for the learning rate (<math>\alpha</math>) and the momentum (<math>\beta</math>) is a matter of experience. Both values range between 0 and 1. This PHP implementation uses a default value of 0.5 for <math>\alpha</math> and 0.95 for <math>\beta</math>. These factors can be changed at runtime.
== Binary and linear input ==
If binary input is used, the input value is simply 0 for ''false'' and 1 for ''true''.
<math>0 : False</math>
<math>1 : True</math>
When using linear input values, normalization is needed:
<math>i = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>i</math> input value for neural network
:<math>f</math> real world value
This PHP implementation supports input normalization.
== Binary and linear output ==
The interpretation of output values only makes sense for the output layer and depends on the use of the neural network. If the network is used for classification, binary output is used, which has two states: true or false. The network always produces linear output values, so these values have to be converted to binary values:
<math>o >= 0.5 : True</math>
<math>o < 0.5 : False</math>
:<math>o</math> output value
When using linear output, the output values have to be converted back to the real-world range the network is trained for:
<math>f = o \cdot (f_{max} - f_{min}) + f_{min}</math>
:<math>f</math> real world value
:<math>o</math> real output value of neural network
The same normalization equation for input values is used for output values while training the network.
<math>o = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>o</math> desired output value for neural network
:<math>f</math> real world value
This PHP implementation supports output normalization.
bea203a8c8e88cedfedb98290ff59207b8732850
115
114
2008-01-13T17:00:00Z
Thwien
2
wikitext
text/x-wiki
== General ==
A multilayer perceptron is a feedforward artificial neural network: the signal flows from the input layer through the hidden layers to the output layer. During training, the error correction of the weights is done in the opposite direction, by the backpropagation algorithm.
== Activation ==
At first a cumulative input is calculated by the following equation:
:<math>s = \sum^{n}_{k=1} i_{k} \cdot w_{k}</math>
Considering the ''BIAS'' value the equation is:
:<math>s = \left(\sum^{n}_{k=1} i_{k} \cdot w_{k}\right) + BIAS \cdot w_{BIAS}</math>
:<math>BIAS</math> = 1
=== Sigmoid activation function ===
:<math>o = \rm{sig}(s) = \frac{1}{1 + \rm e^{-s}}</math>
=== Hyperbolic tangent activation function ===
:<math>o = tanh(s)</math>
using output range between -1 and 1, or
:<math>o = \frac{tanh(s) + 1}{2}</math>
using output range between 0 and 1.
:<math>s</math> cumulative input
:<math>w</math> weight of input
:<math>i</math> value of input
:<math>n</math> number of inputs
:<math>k</math> index of input
== Error of neural network ==
If the neural network is initialized with random weights, it will of course not produce the expected output, so training is necessary. During supervised training, known inputs and their corresponding output values are presented to the network, making it possible to compare the actual output with the desired output. The error is defined by the following equation:
:<math>E={1\over2} \sum^{n}_{i=1} (t_{i}-o_{i})^{2}</math>
:<math>E</math> network error
:<math>n</math> count of input patterns
:<math>t_{i}</math> desired output
:<math>o_{i}</math> calculated output
== Backpropagation ==
The learning algorithm of a single-layer perceptron is simple compared to that of a multilayer perceptron, because only the output layer is directly connected to the output, while the hidden layers are not. Therefore calculating the correct weights of the hidden layers is mathematically difficult. The delta value for changing the weights of a hidden neuron is given by the following equation:
:<math>\Delta w_{ij}= -\alpha \cdot {\partial E \over \partial w_{ij}} = \alpha \cdot \delta_{j} \cdot x_{i}</math>
:<math>E</math> network error
:<math>\Delta w_{ij}</math> delta value <math>w_{ij}</math> of neuron connection <math>i</math> to <math>j</math>
:<math>\alpha</math> learning rate
:<math>\delta_{j}</math> the error of neuron <math>j</math>
:<math>x_{i}</math> input of neuron <math>i</math>
:<math>t_{j}</math> desired output of output neuron <math>j</math>
:<math>o_{j}</math> real output of output neuron <math>j</math>.
== Programming solution of backpropagation ==
In this PHP implementation of the multilayer perceptron, the following algorithm is used for weight changes in the hidden layers:
:<math>s_{kl} = \sum^{n}_{l=1} w_{k} \cdot \Delta w_{l} \cdot \beta</math>
:<math>\Delta w_{k} = o_{k} \cdot (1 - o_{k}) \cdot s_{kl}</math>
:<math>w_{mk} = w_{mk} + \alpha \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>\alpha</math> learning rate
:<math>\beta</math> momentum
:<math>k</math> neuron k
:<math>l</math> neuron l
:<math>m</math> weight m
:<math>i</math> input
:<math>o</math> output
:<math>n</math> count of neurons
To avoid overfitting of the neural network, the training procedure is finished once the actual output value lies within a fault tolerance of 1 per cent of the desired output value.
== Choosing learning rate and momentum ==
Choosing proper values for the learning rate (<math>\alpha</math>) and the momentum (<math>\beta</math>) is a matter of experience. Both values range between 0 and 1. This PHP implementation uses a default value of 0.5 for <math>\alpha</math> and 0.95 for <math>\beta</math>. Neither <math>\alpha</math> nor <math>\beta</math> may be zero; otherwise no weight change would happen and the network would never reach an errorless level. These factors can be changed at runtime.
== Binary and linear input ==
If binary input is used, the input value is simply 0 for ''false'' and 1 for ''true''.
:<math>0 : False</math>
:<math>1 : True</math>
When using linear input values, normalization is needed:
:<math>i = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>i</math> input value for neural network
:<math>f</math> real world value
This PHP implementation supports input normalization.
== Binary and linear output ==
The interpretation of output values only makes sense for the output layer and depends on the use of the neural network. If the network is used for classification, binary output is used, which has two states: true or false. The network always produces linear output values, so these values have to be converted to binary values:
:<math>o < 0.5 : False</math>
:<math>o >= 0.5 : True</math>
:<math>o</math> output value
When using linear output, the output values have to be converted back to the real-world range the network is trained for:
:<math>f = o \cdot (f_{max} - f_{min}) + f_{min}</math>
:<math>f</math> real world value
:<math>o</math> real output value of neural network
The same normalization equation for input values is used for output values while training the network.
:<math>o = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>o</math> desired output value for neural network
:<math>f</math> real world value
This PHP implementation supports output normalization.
009965dcd16e65ed83ee0fab6fd23f53d428def2
117
115
2008-01-13T17:13:46Z
Thwien
2
/* Programming solution of backpropagation */
wikitext
text/x-wiki
== General ==
A multilayer perceptron is a feedforward artificial neural network: the signal flows from the input layer through the hidden layers to the output layer. During training, the error correction of the weights is done in the opposite direction, by the backpropagation algorithm.
== Activation ==
At first a cumulative input is calculated by the following equation:
:<math>s = \sum^{n}_{k=1} i_{k} \cdot w_{k}</math>
Considering the ''BIAS'' value the equation is:
:<math>s = \left(\sum^{n}_{k=1} i_{k} \cdot w_{k}\right) + BIAS \cdot w_{BIAS}</math>
:<math>BIAS</math> = 1
=== Sigmoid activation function ===
:<math>o = \rm{sig}(s) = \frac{1}{1 + \rm e^{-s}}</math>
=== Hyperbolic tangent activation function ===
:<math>o = tanh(s)</math>
using output range between -1 and 1, or
:<math>o = \frac{tanh(s) + 1}{2}</math>
using output range between 0 and 1.
:<math>s</math> cumulative input
:<math>w</math> weight of input
:<math>i</math> value of input
:<math>n</math> number of inputs
:<math>k</math> index of input
== Error of neural network ==
If the neural network is initialized with random weights, it will of course not produce the expected output, so training is necessary. During supervised training, known inputs and their corresponding output values are presented to the network, making it possible to compare the actual output with the desired output. The error is defined by the following equation:
:<math>E={1\over2} \sum^{n}_{i=1} (t_{i}-o_{i})^{2}</math>
:<math>E</math> network error
:<math>n</math> count of input patterns
:<math>t_{i}</math> desired output
:<math>o_{i}</math> calculated output
== Backpropagation ==
The learning algorithm of a single-layer perceptron is simple compared to that of a multilayer perceptron, because only the output layer is directly connected to the output, while the hidden layers are not. Therefore calculating the correct weights of the hidden layers is mathematically difficult. The delta value for changing the weights of a hidden neuron is given by the following equation:
:<math>\Delta w_{ij}= -\alpha \cdot {\partial E \over \partial w_{ij}} = \alpha \cdot \delta_{j} \cdot x_{i}</math>
:<math>E</math> network error
:<math>\Delta w_{ij}</math> delta value <math>w_{ij}</math> of neuron connection <math>i</math> to <math>j</math>
:<math>\alpha</math> learning rate
:<math>\delta_{j}</math> the error of neuron <math>j</math>
:<math>x_{i}</math> input of neuron <math>i</math>
:<math>t_{j}</math> desired output of output neuron <math>j</math>
:<math>o_{j}</math> real output of output neuron <math>j</math>.
== Programming solution of backpropagation ==
In this PHP implementation of the multilayer perceptron, the following algorithm is used for weight changes in the hidden layers:
:<math>s_{kl} = \sum^{n}_{l=1} w_{k} \cdot \Delta w_{l} \cdot \beta</math>
:<math>\Delta w_{k} = o_{k} \cdot (1 - o_{k}) \cdot s_{kl}</math>
:<math>w_{mk} = w_{mk} + \alpha \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>\alpha</math> learning rate
:<math>\beta</math> momentum
:<math>k</math> neuron k
:<math>l</math> neuron l
:<math>m</math> weight m
:<math>i</math> input
:<math>o</math> output
:<math>n</math> count of neurons
=== Momentum ===
To avoid oscillating weight changes, the momentum factor <math>\beta</math> is used, so the calculated weight change is not always the same.
=== Overfitting ===
To avoid overfitting, this PHP implementation finishes the training procedure once the actual output value lies within a fault tolerance of 1 per cent of the desired output value.
== Choosing learning rate and momentum ==
Choosing proper values for the learning rate (<math>\alpha</math>) and the momentum (<math>\beta</math>) is a matter of experience. Both values range between 0 and 1. This PHP implementation uses a default value of 0.5 for <math>\alpha</math> and 0.95 for <math>\beta</math>. Neither <math>\alpha</math> nor <math>\beta</math> may be zero; otherwise no weight change would happen and the network would never reach an errorless level. These factors can be changed at runtime.
== Binary and linear input ==
If binary input is used, the input value is simply 0 for ''false'' and 1 for ''true''.
:<math>0 : False</math>
:<math>1 : True</math>
When using linear input values, normalization is needed:
:<math>i = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>i</math> input value for neural network
:<math>f</math> real world value
This PHP implementation supports input normalization.
== Binary and linear output ==
The interpretation of output values only makes sense for the output layer and depends on the use of the neural network. If the network is used for classification, binary output is used, which has two states: true or false. The network always produces linear output values, so these values have to be converted to binary values:
:<math>o < 0.5 : False</math>
:<math>o >= 0.5 : True</math>
:<math>o</math> output value
When using linear output, the output values have to be converted back to the real-world range the network is trained for:
:<math>f = o \cdot (f_{max} - f_{min}) + f_{min}</math>
:<math>f</math> real world value
:<math>o</math> real output value of neural network
The same normalization equation for input values is used for output values while training the network.
:<math>o = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>o</math> desired output value for neural network
:<math>f</math> real world value
This PHP implementation supports output normalization.
7a75677d29e54ae0d3b10d2472ae58d72a74cb9f
118
117
2008-01-13T17:18:57Z
Thwien
2
wikitext
text/x-wiki
== General ==
A multilayer perceptron is a feedforward artificial neural network: the signal flows from the input layer through the hidden layers to the output layer. During training, the error correction of the weights is done in the opposite direction, by the backpropagation algorithm.
== Activation ==
At first a cumulative input is calculated by the following equation:
:<math>s = \sum^{n}_{k=1} i_{k} \cdot w_{k}</math>
Considering the ''BIAS'' value the equation is:
:<math>s = \left(\sum^{n}_{k=1} i_{k} \cdot w_{k}\right) + BIAS \cdot w_{BIAS}</math>
:<math>BIAS</math> = 1
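The cumulative input including the ''BIAS'' term can be sketched in PHP as follows (function and parameter names are illustrative and not taken from the ANN library):

```php
<?php
// Cumulative input of one neuron:
//   s = (sum over k of i_k * w_k) + BIAS * w_BIAS, with BIAS = 1.
// $inputs and $weights are plain arrays of equal length; the bias
// weight is passed separately.
function cumulativeInput(array $inputs, array $weights, float $biasWeight): float
{
    $s = 0.0;
    foreach ($inputs as $k => $input) {
        $s += $input * $weights[$k];
    }
    return $s + 1.0 * $biasWeight; // BIAS is the constant 1
}
```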
=== Sigmoid activation function ===
:<math>o = \rm{sig}(s) = \frac{1}{1 + \rm e^{-s}}</math>
=== Hyperbolic tangent activation function ===
:<math>o = tanh(s)</math>
using output range between -1 and 1, or
:<math>o = \frac{tanh(s) + 1}{2}</math>
using output range between 0 and 1.
:<math>s</math> cumulative input
:<math>w</math> weight of input
:<math>i</math> value of input
:<math>n</math> number of inputs
:<math>k</math> index of input
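The two activation functions can be expressed directly in PHP, using the built-in `exp` and `tanh` functions (the wrapper names are illustrative):

```php
<?php
// Sigmoid activation: o = 1 / (1 + e^-s), output in (0, 1)
function sigmoidActivation(float $s): float
{
    return 1.0 / (1.0 + exp(-$s));
}

// Hyperbolic tangent activation, output in (-1, 1)
function tanhActivation(float $s): float
{
    return tanh($s);
}

// Rescaled hyperbolic tangent, output in (0, 1)
function tanhActivation01(float $s): float
{
    return (tanh($s) + 1.0) / 2.0;
}
```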
== Error of neural network ==
If the neural network is initialized with random weights, it will of course not produce the expected output, so training is necessary. During supervised training, known inputs and their corresponding output values are presented to the network, making it possible to compare the actual output with the desired output. The error is defined by the following equation:
:<math>E={1\over2} \sum^{n}_{i=1} (t_{i}-o_{i})^{2}</math>
:<math>E</math> network error
:<math>n</math> count of input patterns
:<math>t_{i}</math> desired output
:<math>o_{i}</math> calculated output
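The network error can be computed with a short PHP function (an illustrative sketch, not the library's own code):

```php
<?php
// Network error: E = 1/2 * sum over i of (t_i - o_i)^2,
// where $desired holds the t_i and $calculated the o_i values.
function networkError(array $desired, array $calculated): float
{
    $sum = 0.0;
    foreach ($desired as $i => $t) {
        $sum += ($t - $calculated[$i]) ** 2;
    }
    return 0.5 * $sum;
}
```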
== Backpropagation ==
The learning algorithm of a single-layer perceptron is simple compared to that of a multilayer perceptron, because only the output layer is directly connected to the output, while the hidden layers are not. Therefore calculating the correct weights of the hidden layers is mathematically difficult. The delta value for changing the weights of a hidden neuron is given by the following equation:
:<math>\Delta w_{ij}= -\alpha \cdot {\partial E \over \partial w_{ij}} = \alpha \cdot \delta_{j} \cdot x_{i}</math>
:<math>E</math> network error
:<math>\Delta w_{ij}</math> delta value <math>w_{ij}</math> of neuron connection <math>i</math> to <math>j</math>
:<math>\alpha</math> learning rate
:<math>\delta_{j}</math> the error of neuron <math>j</math>
:<math>x_{i}</math> input of neuron <math>i</math>
:<math>t_{j}</math> desired output of output neuron <math>j</math>
:<math>o_{j}</math> real output of output neuron <math>j</math>.
== Programming solution of backpropagation ==
In this PHP implementation of the multilayer perceptron, the following algorithm is used for weight changes in the hidden layers:
:<math>s_{kl} = \sum^{n}_{l=1} w_{k} \cdot \Delta w_{l} \cdot \beta</math>
:<math>\Delta w_{k} = o_{k} \cdot (1 - o_{k}) \cdot s_{kl}</math>
:<math>w_{mk} = w_{mk} + \alpha \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>\alpha</math> learning rate
:<math>\beta</math> momentum
:<math>k</math> neuron k
:<math>l</math> neuron l
:<math>m</math> weight m
:<math>i</math> input
:<math>o</math> output
:<math>n</math> count of neurons
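The hidden-layer weight change above can be sketched in PHP like this (a minimal sketch under the assumption that <math>w</math> in the sum denotes the weight from hidden neuron <math>k</math> to neuron <math>l</math> of the following layer; all names are illustrative):

```php
<?php
// Delta of hidden neuron k:
//   s_kl    = sum over l of w_l * delta_l * beta
//   delta_k = o_k * (1 - o_k) * s_kl
// $outgoingWeights[$l] is the weight from hidden neuron k to neuron l
// of the following layer, $nextDeltas[$l] that neuron's delta value.
function hiddenDelta(float $output, array $outgoingWeights, array $nextDeltas, float $momentum): float
{
    $s = 0.0;
    foreach ($outgoingWeights as $l => $weight) {
        $s += $weight * $nextDeltas[$l] * $momentum;
    }
    return $output * (1.0 - $output) * $s;
}

// Weight update: w_mk = w_mk + alpha * i_m * delta_k
function updateWeight(float $weight, float $learningRate, float $input, float $delta): float
{
    return $weight + $learningRate * $input * $delta;
}
```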
=== Momentum ===
To avoid oscillating weight changes, the momentum factor <math>\beta</math> is used, so the calculated weight change is not always the same.
=== Overfitting ===
To avoid overfitting, this PHP implementation finishes the training procedure once the actual output value lies within a fault tolerance of 1 per cent of the desired output value.
=== Choosing learning rate and momentum ===
Choosing proper values for the learning rate (<math>\alpha</math>) and the momentum (<math>\beta</math>) is a matter of experience. Both values range between 0 and 1. This PHP implementation uses a default value of 0.5 for <math>\alpha</math> and 0.95 for <math>\beta</math>. Neither <math>\alpha</math> nor <math>\beta</math> may be zero; otherwise no weight change would happen and the network would never reach an errorless level. These factors can be changed at runtime.
== Binary and linear input ==
If binary input is used, the input value is simply 0 for ''false'' and 1 for ''true''.
:<math>0 : False</math>
:<math>1 : True</math>
When using linear input values, normalization is needed:
:<math>i = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>i</math> input value for neural network
:<math>f</math> real world value
This PHP implementation supports input normalization.
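The input normalization can be sketched in PHP as follows (the function name is illustrative and not taken from the ANN library):

```php
<?php
// Input normalization: i = (f - f_min) / (f_max - f_min),
// mapping a real-world value f from [f_min, f_max] into [0, 1].
function normalizeInput(float $f, float $fMin, float $fMax): float
{
    return ($f - $fMin) / ($fMax - $fMin);
}
```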
== Binary and linear output ==
The interpretation of output values only makes sense for the output layer and depends on the use of the neural network. If the network is used for classification, binary output is used, which has two states: true or false. The network always produces linear output values, so these values have to be converted to binary values:
:<math>o < 0.5 : False</math>
:<math>o >= 0.5 : True</math>
:<math>o</math> output value
When using linear output, the output values have to be converted back to the real-world range the network is trained for:
:<math>f = o \cdot (f_{max} - f_{min}) + f_{min}</math>
:<math>f</math> real world value
:<math>o</math> real output value of neural network
The same normalization equation for input values is used for output values while training the network.
:<math>o = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>o</math> desired output value for neural network
:<math>f</math> real world value
This PHP implementation supports output normalization.
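The two output conversions above can be sketched in PHP like this (function names are illustrative, not taken from the ANN library):

```php
<?php
// Denormalization of a linear output: f = o * (f_max - f_min) + f_min,
// the inverse of the input normalization used during training.
function denormalizeOutput(float $o, float $fMin, float $fMax): float
{
    return $o * ($fMax - $fMin) + $fMin;
}

// Interpretation of a linear output as a binary value (threshold 0.5).
function binaryOutput(float $o): bool
{
    return $o >= 0.5;
}
```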
942c04750d70da709537a85cdab0cb10477ec1cb
120
118
2008-01-13T17:48:05Z
Thwien
2
/* Programming solution of backpropagation */
wikitext
text/x-wiki
== General ==
A multilayer perceptron is a feedforward artificial neural network: the signal flows from the input layer through the hidden layers to the output layer. During training, the error correction of the weights is done in the opposite direction, by the backpropagation algorithm.
== Activation ==
At first a cumulative input is calculated by the following equation:
:<math>s = \sum^{n}_{k=1} i_{k} \cdot w_{k}</math>
Considering the ''BIAS'' value the equation is:
:<math>s = \left(\sum^{n}_{k=1} i_{k} \cdot w_{k}\right) + BIAS \cdot w_{BIAS}</math>
:<math>BIAS</math> = 1
=== Sigmoid activation function ===
:<math>o = \rm{sig}(s) = \frac{1}{1 + \rm e^{-s}}</math>
=== Hyperbolic tangent activation function ===
:<math>o = tanh(s)</math>
using output range between -1 and 1, or
:<math>o = \frac{tanh(s) + 1}{2}</math>
using output range between 0 and 1.
:<math>s</math> cumulative input
:<math>w</math> weight of input
:<math>i</math> value of input
:<math>n</math> number of inputs
:<math>k</math> index of input
== Error of neural network ==
If the neural network is initialized with random weights, it will of course not produce the expected output, so training is necessary. During supervised training, known inputs and their corresponding output values are presented to the network, making it possible to compare the actual output with the desired output. The error is defined by the following equation:
:<math>E={1\over2} \sum^{n}_{i=1} (t_{i}-o_{i})^{2}</math>
:<math>E</math> network error
:<math>n</math> count of input patterns
:<math>t_{i}</math> desired output
:<math>o_{i}</math> calculated output
== Backpropagation ==
The learning algorithm of a single-layer perceptron is simple compared to that of a multilayer perceptron, because only the output layer is directly connected to the output, while the hidden layers are not. Therefore calculating the correct weights of the hidden layers is mathematically difficult. The delta value for changing the weights of a hidden neuron is given by the following equation:
:<math>\Delta w_{ij}= -\alpha \cdot {\partial E \over \partial w_{ij}} = \alpha \cdot \delta_{j} \cdot x_{i}</math>
:<math>E</math> network error
:<math>\Delta w_{ij}</math> delta value <math>w_{ij}</math> of neuron connection <math>i</math> to <math>j</math>
:<math>\alpha</math> learning rate
:<math>\delta_{j}</math> the error of neuron <math>j</math>
:<math>x_{i}</math> input of neuron <math>i</math>
:<math>t_{j}</math> desired output of output neuron <math>j</math>
:<math>o_{j}</math> real output of output neuron <math>j</math>.
== Programming solution of backpropagation ==
In this PHP implementation of the multilayer perceptron, the following algorithms are used for weight changes in the hidden layers and the output layer.
=== Weight change of output layer ===
:<math>\Delta w_{k} = o_{k} \cdot (a_{k} - o_{k}) \cdot (1 - o_{k})</math>
:<math>w_{mk} = w_{mk} + \alpha \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>k</math> neuron k
:<math>o</math> output
:<math>i</math> input
:<math>a</math> desired output
:<math>m</math> weight m
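The two output-layer equations can be combined into one update step. The following Python sketch is only an illustration of those equations, not the library's PHP code, and the function name is my own:

```python
def train_output_neuron(weights, inputs, output, desired, alpha=0.5):
    # delta_k = o_k * (a_k - o_k) * (1 - o_k): the error scaled by the
    # derivative of the sigmoid activation
    delta = output * (desired - output) * (1.0 - output)
    # w_mk = w_mk + alpha * i_m * delta_k, applied to every incoming weight m
    new_weights = [w + alpha * i * delta for w, i in zip(weights, inputs)]
    return new_weights, delta
```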
=== Weight change of hidden layers ===
:<math>s_{kl} = \sum^{n}_{l=1} w_{k} \cdot \Delta w_{l} \cdot \beta</math>
:<math>\Delta w_{k} = o_{k} \cdot (1 - o_{k}) \cdot s_{kl}</math>
:<math>w_{mk} = w_{mk} + \alpha \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>\alpha</math> learning rate
:<math>\beta</math> momentum
:<math>k</math> neuron k
:<math>l</math> neuron l
:<math>m</math> weight m
:<math>i</math> input
:<math>o</math> output
:<math>n</math> count of neurons
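The hidden-layer equations can be sketched the same way. Note that in the equations above the momentum <math>\beta</math> scales the error sum propagated back from the following layer; the snippet follows that convention. Again this is an illustrative Python sketch with names of my own, not the library's PHP code:

```python
def train_hidden_neuron(weights, inputs, output, out_weights, out_deltas,
                        alpha=0.5, beta=0.95):
    # s_k = beta * sum_l w_kl * delta_l: error signal from the next layer
    s = beta * sum(w * d for w, d in zip(out_weights, out_deltas))
    # delta_k = o_k * (1 - o_k) * s_k
    delta = output * (1.0 - output) * s
    # w_mk = w_mk + alpha * i_m * delta_k
    new_weights = [w + alpha * i * delta for w, i in zip(weights, inputs)]
    return new_weights, delta
```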
=== Momentum ===
To avoid oscillating weight changes, the momentum factor <math>\beta</math> is introduced. It damps the error signal propagated back from the following layer, so the calculated weight changes do not swing back and forth.
=== Overfitting ===
To avoid overfitting of the neural network, this PHP implementation finishes the training procedure once each actual output value lies within a fault tolerance of 1 per cent of its desired output value.
=== Choosing learning rate and momentum ===
The proper choice of learning rate (<math>\alpha</math>) and momentum (<math>\beta</math>) is a matter of experience. Both values lie in the range between 0 and 1. This PHP implementation uses a default value of 0.5 for <math>\alpha</math> and 0.95 for <math>\beta</math>. Neither <math>\alpha</math> nor <math>\beta</math> may be zero; otherwise no weight change would happen and the network would never reach an errorless level. Both factors can be changed at runtime.
=== Dynamic learning rate ===
To make the network converge faster towards its lowest error, a dynamic learning rate can be a good approach.
:<math>w_{mk} = w_{mk} + \alpha \cdot \gamma \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>\alpha \cdot \gamma \in [0.5, 0.9]</math>
:<math>\alpha</math> learning rate
:<math>\beta</math> momentum
:<math>\gamma</math> dynamic learning rate factor
:<math>k</math> neuron k
:<math>w</math> weight
:<math>m</math> weight m
:<math>i</math> input
This PHP implementation supports dynamic learning rate by default.
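As an illustration of the equation above (Python sketch with my own names, not the library's PHP code; the value of <math>\gamma</math> here is an assumption chosen so that <math>\alpha \cdot \gamma</math> stays inside the stated range):

```python
def dynamic_weight_change(weight, inp, delta, alpha=0.5, gamma=1.4):
    # w_mk = w_mk + alpha * gamma * i_m * delta_k
    # with alpha = 0.5 and gamma = 1.4, alpha * gamma = 0.7, inside [0.5, 0.9]
    return weight + alpha * gamma * inp * delta
```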
=== Weight decay ===
Normally the weights grow towards large numbers during training, although this is not actually necessary. The weight decay algorithm tries to avoid large weights, since with large weights the convergence of the network may take too long.
The weight change algorithm without weight decay is the following:
:<math>\delta w_{i}(t) = \alpha \cdot \frac{\part E(t)}{\part w_{i}(t)}</math>
By subtracting a term proportional to the previous weight, the weight change is reduced in relation to the last weight:
:<math>\delta w_{i}(t) = \alpha \cdot \frac{\part E(t)}{\part w_{i}(t)} - \lambda \cdot w_{i}(t-1)</math>
:<math>\lambda \in [0.03, 0.05]</math>
:<math>w</math> weight
:<math>i</math> neuron
:<math>E</math> error function
:<math>t</math> time (training step)
:<math>\alpha</math> learning rate
:<math>\lambda</math> weight decay factor
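The decayed weight change can be sketched as follows. This Python snippet only illustrates the second equation above (the library itself is PHP, and the names are my own):

```python
def weight_change_with_decay(gradient, prev_weight, alpha=0.5, decay=0.04):
    # delta_w(t) = alpha * dE/dw - lambda * w(t-1)
    # decay (lambda) is typically chosen from [0.03, 0.05]
    return alpha * gradient - decay * prev_weight
```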
== Binary and linear input ==
If binary input is used, the input value is simply 0 for ''false'' and 1 for ''true''.
:<math>0 : False</math>
:<math>1 : True</math>
When using linear input values, normalization is needed:
:<math>i = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>i</math> input value for neural network
:<math>f</math> real world value
This PHP implementation supports input normalization.
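The normalization equation maps a real-world value onto the range between 0 and 1. A minimal Python sketch (illustration only, the function name is my own):

```python
def normalize_input(f, f_min, f_max):
    # i = (f - f_min) / (f_max - f_min): maps [f_min, f_max] onto [0, 1]
    return (f - f_min) / (f_max - f_min)

# A value of 15 in the real-world range [10, 20] becomes the input 0.5.
i = normalize_input(15.0, 10.0, 20.0)
```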
== Binary and linear output ==
The interpretation of output values only makes sense for the output layer, and it depends on the use of the neural network. If the network is used for classification, binary output is used; binary output has exactly two states, true and false. Since the network always produces linear output values, these values have to be converted to binary values:
:<math>o < 0.5 : False</math>
:<math>o >= 0.5 : True</math>
:<math>o</math> output value
When using linear output, the output values have to be denormalized to the real-world range the network was trained for:
:<math>f = o \cdot (f_{max} - f_{min}) + f_{min}</math>
:<math>f</math> real world value
:<math>o</math> real output value of neural network
While training the network, the same normalization equation as for the input values is used for the output values:
:<math>o = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>o</math> desired output value for neural network
:<math>f</math> real world value
This PHP implementation supports output normalization.
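The denormalization equation is the exact inverse of the training-time normalization. Illustrative Python sketch (the function name is my own, not the library's API):

```python
def denormalize_output(o, f_min, f_max):
    # f = o * (f_max - f_min) + f_min: inverse of (f - f_min) / (f_max - f_min)
    return o * (f_max - f_min) + f_min

# A network output of 0.5 in the real-world range [10, 20] means the value 15.
f = denormalize_output(0.5, 10.0, 20.0)
```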
bfe158473a0796be6632afbda8c5a4e8092de1b8
150
149
2008-01-18T14:15:34Z
Thwien
2
/* Programming solution of backpropagation */
wikitext
text/x-wiki
== General ==
A multilayer perceptron is a feedforward artificial neural network: the signal flows from the input layer through the hidden layers to the output layer. During training, the correction of the weights runs in the opposite direction; this is done by the backpropagation algorithm.
== Activation ==
First, a cumulative input is calculated by the following equation:
:<math>s = \sum^{n}_{k=1} i_{k} \cdot w_{k}</math>
Considering the ''BIAS'' value the equation is:
:<math>s = \left(\sum^{n}_{k=1} i_{k} \cdot w_{k}\right) + BIAS \cdot w_{0}</math>
:<math>BIAS = 1</math>
:<math>w_{0}</math> weight of the ''BIAS'' input
=== Sigmoid activation function ===
:<math>o = \rm{sig}(s) = \frac{1}{1 + \rm e^{-s}}</math>
=== Hyperbolic tangent activation function ===
:<math>o = \tanh(s)</math>
using an output range between -1 and 1, or
:<math>o = \frac{\tanh(s) + 1}{2}</math>
using an output range between 0 and 1.
:<math>s</math> cumulative input
:<math>w</math> weight of the input
:<math>i</math> value of the input
:<math>n</math> number of inputs
:<math>k</math> index of the input
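As a rough PHP sketch, the activation step above could look like this (the function names are invented for illustration and are not the classes of this library):

```php
<?php
// Cumulative input: s = sum(i_k * w_k) + BIAS * w_bias, with BIAS = 1.
function cumulativeInput(array $inputs, array $weights, $biasWeight)
{
    $s = 0.0;
    foreach ($inputs as $k => $input) {
        $s += $input * $weights[$k];
    }
    return $s + 1.0 * $biasWeight; // BIAS = 1
}

// Sigmoid activation: o = 1 / (1 + e^-s), output range (0, 1).
function sigmoid($s)
{
    return 1.0 / (1.0 + exp(-$s));
}

// Hyperbolic tangent activation, rescaled to the range (0, 1).
function tanhActivation($s)
{
    return (tanh($s) + 1.0) / 2.0;
}
```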
== Error of neural network ==
If the neural network is initialized with random weights, it will of course not produce the expected output, so training is necessary. During supervised training, known inputs and their corresponding output values are presented to the network, so the real output can be compared with the desired output. The error is defined by the following equation:
:<math>E={1\over2} \sum^{n}_{i=1} (t_{i}-o_{i})^{2}</math>
:<math>E</math> network error
:<math>n</math> number of input patterns
:<math>t_{i}</math> desired output
:<math>o_{i}</math> calculated output
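A minimal PHP sketch of this error calculation (a hypothetical helper, not part of this library):

```php
<?php
// Network error E = 1/2 * sum((t_i - o_i)^2), following the equation above.
function networkError(array $desired, array $calculated)
{
    $e = 0.0;
    foreach ($desired as $i => $t) {
        $d = $t - $calculated[$i];
        $e += $d * $d;
    }
    return 0.5 * $e;
}
```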
== Backpropagation ==
The learning algorithm of a single-layer perceptron is simple compared to that of a multilayer perceptron: only the output layer is connected directly to the output, the hidden layers are not. Calculating the right weights for the hidden layers is therefore mathematically difficult. The delta value for changing the weights of a hidden neuron is given by the following equation:
:<math>\Delta w_{ij}= -\alpha \cdot {\partial E \over \partial w_{ij}} = \alpha \cdot \delta_{j} \cdot x_{i}</math>
:<math>E</math> network error
:<math>\Delta w_{ij}</math> change of weight <math>w_{ij}</math> of the connection from neuron <math>i</math> to neuron <math>j</math>
:<math>\alpha</math> learning rate
:<math>\delta_{j}</math> the error of neuron <math>j</math>
:<math>x_{i}</math> input of neuron <math>i</math>
:<math>t_{j}</math> desired output of output neuron <math>j</math>
:<math>o_{j}</math> real output of output neuron <math>j</math>.
== Programming solution of backpropagation ==
In this PHP implementation of the multilayer perceptron, the following equations are used for the weight changes in the hidden layers and the output layer.
=== Weight change of output layer ===
:<math>\Delta w_{k} = o_{k} \cdot (a_{k} - o_{k}) \cdot (1 - o_{k})</math>
:<math>w_{mk} = w_{mk} + \alpha \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>k</math> index of the neuron
:<math>o</math> output
:<math>i</math> input
:<math>a</math> desired output
:<math>w</math> weight
:<math>m</math> index of the weight
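The two output-layer equations can be sketched in PHP as follows (hypothetical helper functions, not this library's API):

```php
<?php
// Output-layer update, following the equations above:
// delta_k = o_k * (a_k - o_k) * (1 - o_k)
// w_mk    = w_mk + alpha * i_m * delta_k
function outputDelta($output, $desired)
{
    return $output * ($desired - $output) * (1.0 - $output);
}

function updateWeight($weight, $alpha, $input, $delta)
{
    return $weight + $alpha * $input * $delta;
}
```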
=== Weight change of hidden layers ===
:<math>s_{kl} = \sum^{n}_{l=1} w_{k} \cdot \Delta w_{l} \cdot \beta</math>
:<math>\Delta w_{k} = o_{k} \cdot (1 - o_{k}) \cdot s_{kl}</math>
:<math>w_{mk} = w_{mk} + \alpha \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>\alpha</math> learning rate
:<math>\beta</math> momentum
:<math>k</math> index of the neuron
:<math>l</math> index of the neuron in the following layer
:<math>w</math> weight
:<math>m</math> index of the weight
:<math>i</math> input
:<math>o</math> output
:<math>n</math> number of neurons in the following layer
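The hidden-layer delta can be sketched in PHP like this (a hypothetical helper; the momentum factor <math>\beta</math> enters the sum exactly as in the equation above):

```php
<?php
// Hidden-layer delta, following the equations above: the deltas of the
// following layer are summed up, weighted by the connecting weights and
// scaled by the momentum factor beta.
// s_k     = sum_l (w_kl * delta_l * beta)
// delta_k = o_k * (1 - o_k) * s_k
function hiddenDelta($output, array $weightsToNext, array $nextDeltas, $beta)
{
    $s = 0.0;
    foreach ($nextDeltas as $l => $delta) {
        $s += $weightsToNext[$l] * $delta * $beta;
    }
    return $output * (1.0 - $output) * $s;
}
```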
=== Momentum ===
To avoid oscillating weight changes, the momentum factor <math>\beta</math> is introduced, so that the calculated weight change is not always the same.
=== Overfitting ===
To avoid overfitting of the neural network, this PHP implementation finishes the training procedure once the real output value lies within a fault tolerance of 1 per cent of the desired output value.
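The stop criterion could be sketched like this (an illustration of the described 1 per cent tolerance, not the actual implementation):

```php
<?php
// Training is considered finished once every real output lies within
// 1 per cent of its desired output.
function trainingFinished(array $desired, array $real, $tolerance = 0.01)
{
    foreach ($desired as $i => $t) {
        if (abs($t - $real[$i]) > $tolerance * abs($t)) {
            return false;
        }
    }
    return true;
}
```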
=== Choosing learning rate and momentum ===
The proper choice of the learning rate (<math>\alpha</math>) and the momentum (<math>\beta</math>) is a matter of experience. Both values range between 0 and 1. This PHP implementation uses a default value of 0.5 for <math>\alpha</math> and 0.95 for <math>\beta</math>. Neither <math>\alpha</math> nor <math>\beta</math> may be zero; otherwise no weight change would happen and the network would never reach an errorless state. These factors can be changed at runtime.
=== Dynamic learning rate ===
To make the network converge faster to its lowest error, using a dynamic learning rate can be a good approach.
:<math>w_{mk} = w_{mk} + \alpha \cdot \gamma \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>\alpha \cdot \gamma = [0.5 .. 0.9]</math>
:<math>\alpha</math> learning rate
:<math>\gamma</math> dynamic learning rate factor
:<math>k</math> index of the neuron
:<math>w</math> weight
:<math>m</math> index of the weight
:<math>i</math> input
This PHP implementation supports dynamic learning rate by default.
=== Weight decay ===
Normally the weights grow to large values, although this is not necessary. The weight decay algorithm tries to avoid large weights, because large weights may make the network converge too slowly.
The weight change algorithm without weight decay is the following:
:<math>\Delta w_{i}(t) = \alpha \cdot \frac{\partial E(t)}{\partial w_{i}(t)}</math>
By subtracting a value proportional to the previous weight, the weight change is reduced:
:<math>\Delta w_{i}(t) = \alpha \cdot \frac{\partial E(t)}{\partial w_{i}(t)} - \lambda \cdot w_{i}(t-1)</math>
:<math>\lambda = [0.03 .. 0.05]</math>
:<math>w</math> weight
:<math>i</math> neuron
:<math>E</math> error function
:<math>t</math> time (training step)
:<math>\alpha</math> learning rate
:<math>\lambda</math> weight decay factor
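A small PHP sketch of the decayed weight change (a hypothetical helper):

```php
<?php
// Weight change with weight decay, following the equation above: the term
// lambda * w(t-1) pulls the weight back towards zero (lambda ~ 0.03 .. 0.05).
// $gradientStep stands for alpha * dE(t)/dw(t).
function weightChangeWithDecay($gradientStep, $previousWeight, $lambda)
{
    return $gradientStep - $lambda * $previousWeight;
}
```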
=== Quick propagation algorithm ===
The Quickprop algorithm calculates the weight change by modelling the error as a quadratic function <math>f(x) = x^2</math>. The error gradients at two different weight values define a secant; fitting this secant to the quadratic function makes it possible to calculate the function's minimum (<math>f'(x) = 0</math>). The x-coordinate of this minimum is the new weight value.
:<math>S(t) = \frac{\partial E}{\partial w_{i}(t)}</math>
:<math>\Delta w_{i}(t) = \alpha \cdot \frac{\partial E}{\partial w_{i}(t)}</math> (normal backpropagation)
:<math>\frac{\Delta w_{i}(t)}{\alpha} = \frac{\partial E}{\partial w_{i}(t)}</math>
:<math>S(t) = \frac{\partial E}{\partial w_{i}(t)} = \frac{\Delta w_{i}(t)}{\alpha}</math>
:<math>\Delta w_{i}(t) = \frac{S(t)}{S(t-1) - S(t)} \cdot \Delta w_{i}(t-1)</math> (quick propagation)
:<math>w</math> weight
:<math>i</math> neuron
:<math>E</math> error function
:<math>t</math> time (training step)
:<math>\alpha</math> learning rate
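A PHP sketch of one Quickprop step, with a worked example on the quadratic error <math>E(w) = w^2</math> (gradient <math>S = 2w</math>), where a single step lands exactly in the minimum (a hypothetical helper, not this library's API):

```php
<?php
// Quickprop weight change: Delta w(t) = S(t) / (S(t-1) - S(t)) * Delta w(t-1),
// where S is the error gradient dE/dw.
function quickpropStep($gradientNow, $gradientPrev, $previousChange)
{
    return $gradientNow / ($gradientPrev - $gradientNow) * $previousChange;
}

// Worked example with E(w) = w^2, so S = 2w: previous weight 3, current
// weight 2, previous change -1. One step lands exactly in the minimum w = 0.
$change = quickpropStep(2.0 * 2.0, 2.0 * 3.0, -1.0); // -2.0
$newWeight = 2.0 + $change;                          // 0.0
```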
== Binary and linear input ==
If binary input is used, the input value is simply 0 for ''false'' and 1 for ''true''.
:<math>0 : False</math>
:<math>1 : True</math>
When using linear input values, normalization is needed:
:<math>i = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>i</math> input value for neural network
:<math>f</math> real world value
This PHP implementation supports input normalization.
== Binary and linear output ==
Interpreting output values only makes sense for the output layer, and the interpretation depends on the use of the neural network. If the network is used for classification, binary output is used. Binary output has two states, ''true'' and ''false'', but the network always produces linear output values, so these values have to be converted to binary values:
:<math>o < 0.5 : False</math>
:<math>o >= 0.5 : True</math>
:<math>o</math> output value
When using linear output, the output values have to be denormalized to the real value the network was trained for:
:<math>f = o \cdot (f_{max} - f_{min}) + f_{min}</math>
:<math>f</math> real world value
:<math>o</math> real output value of neural network
The same normalization equation as for the input values is used for the output values while training the network.
:<math>o = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>o</math> desired output value for neural network
:<math>f</math> real world value
This PHP implementation supports output normalization.
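Input normalization and output denormalization are inverse operations over the same real-world range <math>[f_{min}, f_{max}]</math>; a minimal PHP sketch (hypothetical helper names):

```php
<?php
// Normalization of a real-world value f into the network's range [0, 1].
function normalize($f, $min, $max)
{
    return ($f - $min) / ($max - $min);
}

// Denormalization of a network output o back to the real-world range.
function denormalize($o, $min, $max)
{
    return $o * ($max - $min) + $min;
}
```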
f16712a6363fe03a53342e55402800f248efbff2
Neural Networks
0
6
113
105
2008-01-13T15:03:04Z
Thwien
2
/* Questions and Answers */
wikitext
text/x-wiki
== Questions and Answers ==
* '''What is a neural network?'''
An artificial neural network is a mathematical model of an associative biological neural network.
* '''What are the differences between a biological and an artificial neural network?'''
There are huge differences between the two. The neurons of a biological neural network work massively in parallel, and such a network consists of billions or trillions of massively interconnected neurons. In comparison, an artificial neural network has only a small number of neurons, depending on the amount of data the network has to associate, e.g. 20 neurons. Such an artificial network is not connected at random, but as a controlled and planned layer structure with an input layer, one or a few hidden layers and an output layer (if a [[multilayer perceptron]] topology is used). And since an artificial neural network is computer software, it works sequentially, which of course is another big difference to a biological neural network.
* '''What is the benefit of an artificial neural network?'''
The big advantage of the human brain is that it adapts unknown things it perceives to a learned model stored in its neurons. This is called association. Technically this is a form of imprecise information processing, so an artificial neural network can be used in environments where the input information is incomplete (like face recognition).
* '''What are examples for technical use of artificial neural networks?'''
** '''''Prediction of power consumption:''''' The local electricity utility in Duesseldorf, Germany uses a [[multilayer perceptron]] for the daily prediction of power consumption in the city, based on temperature, humidity, weekday, and other factors.
** '''''Prediction of selling products:''''' Prediction on how many articles of a product in a supermarket will be sold in one week. With this information it is possible to optimize ordering and storage.
** '''''Prediction of incoming calls to a call centre:''''' Prediction of the number of daily calls in a call centre, to plan how many co-workers have to work that day.
** '''''Post code recognition:''''' The German postal service uses neural networks to recognize post codes (PLZ) written on letters by machine or by hand. After recognition a computer-readable code is printed on the letter, so further recognition is not necessary again.
** '''''Solving mathematical functions:''''' Artificial neural networks are also used to find numeric solutions of difficult mathematical functions. The network can associate the output of a function when given one or more unknown inputs. This is only possible by learning known inputs and outputs; the artificial network then has an "idea" of the mathematical rule.
** '''''Classification:''''' General classification can also be done with a neural network. For example, such a network can detect from an image whether the person displayed is looking left or right, is laughing or not, is male or female, and so on.
** '''''Intelligent network routing:''''' A network router that learns the fastest routing of internet packets can be realized with neural networks, so the router can optimize its routing decisions.
** '''''Detecting spam mails:''''' Neural networks are also used to detect spam mails in mail clients or on mail servers.
** '''''Stock market:''''' Prediction of stock values by market rules and psychological reactions.
** '''''Credit assignments:''''' Determine bad credit risk of bank customers by including different aspects.
** '''''Air traffic control'''''
** '''''Robot control'''''
** '''''Game strategy control'''''
** '''''Noise tolerance of analogue modems'''''
** '''''Scheduling buses, trams, air planes and elevators'''''
** '''''Optimization of traffic flows'''''
** '''''Weather forecast'''''
** '''''Music composition:''''' A neural network can learn rules of composition and create its own music based on them. (But until today I have never heard neurally composed music.)
* '''How to train an artificial neural network?'''
Just as a human brain has to: learning by doing, and learning, learning and learning again. The artificial neural network gets inputs and produces an output. This output is compared with the desired output; if there is a difference between them, the network must be changed. This procedure is repeated until the output matches the desired output. After this the artificial network "knows" the rules for producing the right output. This is done for all known inputs and outputs. After training you can use the "knowledge" of the network to recognize output patterns for related known or unknown inputs.
* '''Can an artificial neural network forget?'''
Yes. Just as the human brain works as an associative processor, an artificial network can also make mistakes and can forget knowledge if inputs are trained rarely. If you as a human being rarely practise playing the piano, you cannot play it perfectly. The same applies to an artificial network.
* '''Does an artificial neural network have consciousness?'''
The one and only answer can be: '''No'''. How consciousness arises in a biological neural network has not been figured out to this day, but for an artificial neural network it can be said with certainty that there is no consciousness. It would not gain consciousness even if the complexity of the artificial network were very large. Therefore an artificial neural network cannot die, feel pain or suffer from depression.
* '''Do artificial neural networks have disadvantages?'''
Yes, there are a few problems. A number of neural network topologies exist today, and each topology is suited to a special kind of problem. Neural networks are slow to train, because training can take several tens of thousands of loops. A neural network can also be overfitted if it is overtrained. Another problem is the extrapolation of data if the data is not presented properly to the neural network.
* '''Which kinds of artificial neural networks exist?'''
There are several, for example: [[multilayer perceptron]], self-organizing maps (Kohonen networks), Hopfield networks, generic maps, and more.
== Information about Neural Networks ==
English
* [http://en.wikipedia.org/wiki/Neural_Network Neural Network (en.wikipedia.org)]
* [http://en.wikipedia.org/wiki/Multilayer_perceptron Multilayer Perceptron (en.wikipedia.org)]
German
* [http://de.wikipedia.org/wiki/K%C3%BCnstliches_neuronales_Netz Künstliches Neuronales Netz (de.wikipedia.org)]
* [http://de.wikipedia.org/wiki/Perzeptron#Mehrlagiges_Perzeptron Mehrlagiges Perzeptron (de.wikipedia.org)]
a50da654c972b67af5636e749d31828e0ddafb68
Main Page
0
1
116
109
2008-01-13T17:06:33Z
Thwien
2
/* Overview */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; several improvements and changes to this implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. To get a short idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Versions and Change-Log ==
'''Version 2.0.1 by Thomas Wien''' (2008-01-06) [[Download]]
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.2 by Thomas Wien (Development)'''
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Logging of network error changes for statistical usage
* Calculating total network error for csv logging
* Performance check depending on host system
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
2d8adc4e0d806945e8db770134c7e687327b4ff9
119
116
2008-01-13T17:32:17Z
Thwien
2
/* Todo */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; several improvements and changes to this implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short introduction to the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Versions and Change-Log ==
'''Version 2.0.1 by Thomas Wien''' (2008-01-06) [[Download]]
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.2 by Thomas Wien (Development)'''
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Logging of network error changes for statistical usage
* Calculating total network error for csv logging
* Performance check depending on host system
* Support for dynamic learning rate
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
206579541a95128ffeb0fd177eac5f4f8d9237d1
123
119
2008-01-14T20:06:35Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; several improvements and changes to this implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short introduction to the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Versions and Change-Log ==
'''Version 2.0.2 by Thomas Wien''' (2008-01-14) [[Download]]
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Support for dynamic learning rate
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
c91d8d1e7d2e4b8f76516561c66b22d907043429
133
123
2008-01-16T11:53:40Z
Thwien
2
/* Todo */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; several improvements and changes to this implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short introduction to the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Versions and Change-Log ==
'''Version 2.0.2 by Thomas Wien''' (2008-01-14) [[Download]]
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Support for dynamic learning rate
* Automatic epoch determination
* Shuffling input patterns each epoch instead of randomized pattern access
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
9eec3c60235edca4138daf390b4cad177334242e
135
133
2008-01-16T15:58:28Z
Thwien
2
/* Todo */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; several improvements and changes to this implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short introduction to the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Versions and Change-Log ==
'''Version 2.0.2 by Thomas Wien''' (2008-01-14) [[Download]]
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
84be31dced7003ecd1a998cbd68a5ddfec4ffc3b
136
135
2008-01-17T11:04:07Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; several improvements and changes to this implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short introduction to the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Versions and Change-Log ==
'''Version 2.0.3 by Thomas Wien''' (2008-01-17) [[Download]]
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
665e6d44c068b912ff9125adb497ec137f673a67
139
136
2008-01-17T11:17:37Z
Thwien
2
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; several improvements and changes to this implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short introduction to the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
== Versions and Change-Log ==
'''Version 2.0.3 by Thomas Wien''' (2008-01-17) [[Download]]
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
dce267d447133aee4627811008e331b4d5d3aa91
141
139
2008-01-17T11:24:03Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; several improvements and changes to this implementation were made by ''Thomas Wien'' in 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short introduction to the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
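The momentum feature listed above combines the current gradient step with a fraction of the previous weight change, which damps oscillation during training. As a minimal sketch of the generic backpropagation update formula (an illustration only, not the library's internal code — the function name and values are made up):

<source lang="php">
// Generic weight update with momentum:
//   delta(t) = learningRate * gradient + momentum * delta(t-1)
function updateWeight($weight, $gradient, $previousDelta, $learningRate, $momentum)
{
  $delta = $learningRate * $gradient + $momentum * $previousDelta;

  // Return the new weight and the step, to be reused as $previousDelta next time
  return array($weight + $delta, $delta);
}

list($w, $delta) = updateWeight(0.5, 0.2, 0.1, 0.25, 0.9);
// $delta = 0.25*0.2 + 0.9*0.1 = 0.14, so $w = 0.64
</source>

With momentum 0 the formula reduces to plain gradient descent; values near 1 let previous steps dominate.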
== Versions and Change-Log ==
'''Version 2.0.3 by Thomas Wien''' (2008-01-17) [[Download]]
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
cd8106ae5e60a9b16b383d12f7343afedf7ec056
Examples
0
7
121
92
2008-01-14T19:53:59Z
Thwien
2
/* Several functions of ANN */
wikitext
text/x-wiki
== Logical Functions ==
Training an artificial neural network on logical functions is interesting for learning how to use such a network, but has no practical value. The remarkable point about the XOR function is that, in the history of neural network research, it was shown that XOR cannot be learned by a single neuron. For a long time, however, it was mathematically difficult to find a way to train several connected neurons.
* [[logical XOR function]]
* logical OR function
* logical AND function
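The single-neuron limitation can be verified directly: no combination of two weights and a bias for one threshold neuron reproduces XOR. A brute-force sketch in plain PHP (independent of the ANN classes; the weight grid is arbitrary):

<source lang="php">
// Search weights w1, w2 and bias b of a single threshold neuron
// for an exact fit of XOR - no combination succeeds.
$xor   = array(array(0,0,0), array(0,1,1), array(1,0,1), array(1,1,0));
$found = false;

for($w1 = -2; $w1 <= 2; $w1 += 0.5)
for($w2 = -2; $w2 <= 2; $w2 += 0.5)
for($b = -2; $b <= 2; $b += 0.5)
{
  $ok = true;

  foreach($xor as $row)
  {
    list($x1, $x2, $y) = $row;
    $out = ($w1 * $x1 + $w2 * $x2 + $b > 0) ? 1 : 0;

    if($out != $y) { $ok = false; break; }
  }

  if($ok) $found = true;
}

// $found stays false: XOR is not linearly separable,
// which is why at least one hidden layer is needed.
</source>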
== Prediction ==
One benefit of the multilayer perceptron is its ability to make predictions.
* [[Selling Icecreams]]
* [[Daily power consumption]]
* [[Lottery]]
== Optimization ==
* [[Internet routing decision]]
== Several functions of ANN ==
* [[Visual network topoloy|Visual network topology]]
* [[Logging network weights]]
* [[Client-Server model]]
3788ccef6be33f08f26679f8085cba9820f5720f
Client-Server model
0
15
122
2008-01-14T20:03:15Z
Thwien
2
New page: == Server implementation == <source lang="php"> require_once 'ANN/ANN_Network.php'; require_once 'ANN/ANN_Server.php'; class ANN_MyServer extends ANN_Server { // ***********************...
wikitext
text/x-wiki
== Server implementation ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
require_once 'ANN/ANN_Server.php';

class ANN_MyServer extends ANN_Server
{
// ****************************************************************************

  /**
   * @param string $username
   * @param string $password
   * @return boolean
   */
  protected function checkLogin($username, $password)
  {
    // User-defined authentication by database for example
    return ($username == 'username' && $password == 'password');
  }

// ****************************************************************************
}

$server = new ANN_MyServer;
</source>
== Client implementation ==
<source lang="php">
require_once 'ANN/ANN_Network.php';

try
{
  $network = new ANN_Network;
}
catch(Exception $e)
{
  print 'Network could not be created';
  exit;
}

// Training patterns: the logical XOR function
$inputs = array(
  array(0, 0),
  array(0, 1),
  array(1, 0),
  array(1, 1)
);

$outputs = array(
  array(0),
  array(1),
  array(1),
  array(0)
);

$network->setInputs($inputs);
$network->setOutputs($outputs);

// Delegate training to the remote ANN server
$network = $network->trainByHost('username', 'password', 'http://example.tld/ANN_Server.php');

if($network instanceof ANN_Network)
  $network->printNetwork();
</source>
9710938d96333ff9249b7dac7e383a9d49d0cf49
Download
0
2
124
79
2008-01-14T20:14:26Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
The current stable version of the ANN implementation for PHP 5.x is '''version 2.0.2'''. Go to the [[Installation]] section for information on how to integrate these PHP libraries into your project.
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or above. If the client-server mechanism of the network class is used, the '''cURL extension''' must be available on the client host that connects to an ANN server.
== Version '''2.0.2''' (2008-01-14) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
e26ce78c86ae1eff5082a4a763e5a8da787ffeba
125
124
2008-01-14T20:23:59Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
The current stable version of the ANN implementation for PHP 5.x is '''version 2.0.2'''. Go to the [[Installation]] section for information on how to integrate these PHP libraries into your project.
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or above. If the client-server mechanism of the network class is used, the '''cURL extension''' must be available on the client host that connects to an ANN server.
== Version '''2.0.2''' (2008-01-14) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
dbbf1ec46ba9824ad29083f03a6ff18eecdd6b66
128
125
2008-01-15T11:02:10Z
Thwien
2
/* Requirements */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
The current stable version of the ANN implementation for PHP 5.x is '''version 2.0.2'''. Go to the [[Installation]] section for information on how to integrate these PHP libraries into your project.
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or above. If the client-server mechanism of the network class is used, the '''cURL extension''' must be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' must be installed.
== Version '''2.0.2''' (2008-01-14) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
ad4556ef492414ab3000eea086dbaaa66df75a2b
137
128
2008-01-17T11:08:42Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
The current stable version of the ANN implementation for PHP 5.x is '''version 2.0.3'''. Go to the [[Installation]] section for information on how to integrate these PHP libraries into your project.
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or above. If the client-server mechanism of the network class is used, the '''cURL extension''' must be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' must be installed.
== Version '''2.0.3''' (2008-01-17) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
298b734a638753e8e0472710a75811c03cf12305
138
137
2008-01-17T11:09:42Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
Current stable version of ANN implementation for PHP 5.x is '''version 2.0.3'''. Go to [[Installation]] section to get information on how to implement these PHP libraries into your project.
== Requirements ==
The latest implementation requires a php environment running '''PHP 5.2.x''' or above. If using the client-server mechanism of the network class, the '''curl extension''' must be available on the client host which connects to an ANN server. If using class ''ANN_NetworkGraph'' '''GD library with png support''' must be installed.
== Version '''2.0.3''' (2008-01-17) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Tangens hyperbolicus transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
04d0ef36709892df11ce3295ef2ed9e8ac1ef635
140
138
2008-01-17T11:21:51Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
Current stable version of ANN implementation for PHP 5.x is '''version 2.0.3'''. Go to [[Installation]] section to get information on how to implement these PHP libraries into your project.
== Requirements ==
The latest implementation requires a php environment running '''PHP 5.2.x''' or above. If using the client-server mechanism of the network class, the '''curl extension''' must be available on the client host which connects to an ANN server. If using class ''ANN_NetworkGraph'' '''GD library with png support''' must be installed.
== Version '''2.0.3''' (2008-01-17) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Tangens hyperbolicus transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
4bfb94b5d109756b227061b4770e211f84b0d0dd
142
140
2008-01-17T11:24:27Z
Thwien
2
/* Version '''2.0.3''' (2008-01-17) '''''stable''''' */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
The current stable version of the ANN implementation for PHP 5.x is '''version 2.0.3'''. See the [[Installation]] section for information on how to integrate these PHP libraries into your project.
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or above. If the client-server mechanism of the network class is used, the '''curl extension''' must be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' must be installed.
== Version '''2.0.3''' (2008-01-17) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Tangens hyperbolicus transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
<big>'''Installation'''</big>
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Installation ==
* [[Download]] the source code
* Unpack the archive:
<source lang="bash">
tar -xzf ann203.tar.gz
</source>
* Copy the ''ANN'' directory into your project's library directory.
* Include the network class in your source:
<source lang="php">
<?php
require_once ('ANN/ANN_Network.php');
$ann = new ANN_Network;
?>
</source>
* Learn to use the library: have a look at the [[Examples]] chapter.
== Disclaimer / Impressum (TDG §6, MDStV §10.2) ==
[[Image:impressum.gif]]
<big>'''Selling Icecreams'''</big>
== Training ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$network = new ANN_Network(2,8,1);
$temperature = new ANN_InputValue(-15, 50); // Temperature in Celsius
$temperature->saveToFile('input_temperature.dat');
unset($temperature);
$humidity = new ANN_InputValue(0, 100); // Humidity percentage
$humidity->saveToFile('input_humidity.dat');
unset($humidity);
$icecream = new ANN_OutputValue(0, 300); // Quantity of sold ice-creams
$icecream->saveToFile('output_quantity.dat');
unset($icecream);
}
try
{
$temperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$humidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$icecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
print 'Error loading value objects';
exit;
}
$inputs = array(
array($temperature->getInputValue(20), $humidity->getInputValue(10)),
array($temperature->getInputValue(30), $humidity->getInputValue(40)),
array($temperature->getInputValue(32), $humidity->getInputValue(30)),
array($temperature->getInputValue(33), $humidity->getInputValue(20))
);
$outputs = array(
array($icecream->getOutputValue(20)),
array($icecream->getOutputValue(90)),
array($icecream->getOutputValue(70)),
array($icecream->getOutputValue(75))
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network->train();
$network->saveToFile('icecreams.dat');
</source>
== Using the trained network ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Network not found.';
exit;
}
try
{
$temperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$humidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$icecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
print 'Error loading value objects';
exit;
}
$inputs = array(
array($temperature->getInputValue(20), $humidity->getInputValue(10)),
array($temperature->getInputValue(30), $humidity->getInputValue(40)),
array($temperature->getInputValue(32), $humidity->getInputValue(30)),
array($temperature->getInputValue(33), $humidity->getInputValue(20))
);
$network->setInputs($inputs);
$outputs = $network->getOutputs();
foreach($outputs as $output)
print $icecream->getRealOutputValue($output). '<br />';
</source>
<big>'''Copyright'''</big>
The copyright conditions are included in the source files.
<code>
* Artificial Neural Network - Version 2.0
*
* For updates and changes visit the project page at http://ann.thwien.de/
*
*
*
* <b>LICENCE</b>
*
* This source file is freely re-distributable, with or without modifications
* provided the following conditions are met:
*
* 1. The source files must retain the copyright notice below, this list of
* conditions and the following disclaimer.
*
* 2. The name of the author must not be used to endorse or promote products
* derived from this source file without prior written permission. For
* written permission, please contact me.
*
* <b>DISCLAIMER</b>
*
* THIS SOFTWARE IS PROVIDED BY THE AUTHOR "AS IS" AND
* ANY EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A
* PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE PHP
* AUTHOR OR HIS CONTRIBUTORS BE LIABLE FOR ANY DIRECT,
* INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
* (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
* HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
* STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
* ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
* OF THE POSSIBILITY OF SUCH DAMAGE.
*
* @author Eddy Young <jeyoung_at_priscimon_dot_com>
* @author Thomas Wien <info_at_thwien_dot_de>
* @version ANN Version 1.0 by Eddy Young
* @version ANN Version 2.0 by Thomas Wien
* @copyright Copyright (c) 2002 by Eddy Young
* @copyright Copyright (c) 2007-08 by Thomas Wien
* @package ANN
</code>
<big>'''Logging network weights'''</big>
== Logging network weights while training ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$network = new ANN_Network;
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$outputs = array(
array(0),
array(1),
array(1),
array(0)
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network->logWeightsToFile('network.csv'); // Start logging
$network->train();
$network->saveToFile('xor.dat');
</source>
<big>'''Multilayer perceptron'''</big>
== General ==
A multilayer perceptron is a feedforward artificial neural network: the signal flows from the input layer through the hidden layers to the output layer. During training, the error correction of the weights is done in the opposite direction, by the backpropagation algorithm.
== Activation ==
At first a cumulative input is calculated by the following equation:
:<math>s = \sum^{n}_{k=1} i_{k} \cdot w_{k}</math>
Taking the ''BIAS'' value into account, the equation becomes:
:<math>s = (\sum^{n}_{k=1} i_{k} \cdot w_{k}) + BIAS \cdot w_{0}</math>
:<math>BIAS = 1</math>
=== Sigmoid activation function ===
:<math>o = \rm{sig}(s) = \frac{1}{1 + \rm e^{-s}}</math>
=== Hyperbolic tangent activation function ===
:<math>o = \tanh(s)</math>
for an output range between -1 and 1, or
:<math>o = \frac{\tanh(s) + 1}{2}</math>
for an output range between 0 and 1.
:<math>s</math> cumulative input
:<math>w</math> weight of input
:<math>i</math> value of input
:<math>n</math> number of inputs
:<math>k</math> index of the neuron
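As a sketch, the activation step can be written in plain PHP; the function names are illustrative and are not part of the ANN classes:

```php
<?php
// Cumulative input: s = (sum of i_k * w_k) + BIAS * w_0, with BIAS = 1
function cumulativeInput($inputs, $weights, $biasWeight)
{
  $s = 0;
  foreach ($inputs as $k => $i)
    $s += $i * $weights[$k];
  return $s + 1 * $biasWeight; // BIAS = 1
}

// Sigmoid activation: o = 1 / (1 + e^-s)
function sigmoid($s)
{
  return 1 / (1 + exp(-$s));
}

// Hyperbolic tangent activation, mapped to the output range 0..1
function tanh01($s)
{
  return (tanh($s) + 1) / 2;
}

$o = sigmoid(cumulativeInput(array(0.5, 0.2), array(0.4, -0.3), 0.1));
```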
== Error of neural network ==
A neural network initialized with random weights will of course not produce the expected output, so training is necessary. During supervised training, known inputs and their corresponding output values are presented to the network, which makes it possible to compare the actual output with the desired output. The error is defined by the following equation:
:<math>E={1\over2} \sum^{n}_{i=1} (t_{i}-o_{i})^{2}</math>
:<math>E</math> network error
:<math>n</math> count of input patterns
:<math>t_{i}</math> desired output
:<math>o_{i}</math> calculated output
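The network error can be sketched the same way (illustrative helper, not part of the ANN classes):

```php
<?php
// Network error: E = 1/2 * sum of (t_i - o_i)^2
function networkError($desired, $actual)
{
  $e = 0;
  foreach ($desired as $i => $t)
    $e += pow($t - $actual[$i], 2);
  return $e / 2;
}

// Desired XOR outputs compared with the outputs of an untrained network
$error = networkError(array(0, 1, 1, 0), array(0.4, 0.6, 0.5, 0.5));
```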
== Backpropagation ==
The learning algorithm of a single-layer perceptron is simple compared to that of a multilayer perceptron, because only the output layer is directly connected to the output; the hidden layers are not. This makes the mathematically correct weight changes in the hidden layers difficult to calculate. The delta value for changing the weight of a hidden neuron is given by the following equation:
:<math>\Delta w_{ij}= -\alpha \cdot {\partial E \over \partial w_{ij}} = \alpha \cdot \delta_{j} \cdot x_{i}</math>
:<math>E</math> network error
:<math>\Delta w_{ij}</math> delta value <math>w_{ij}</math> of neuron connection <math>i</math> to <math>j</math>
:<math>\alpha</math> learning rate
:<math>\delta_{j}</math> the error of neuron <math>j</math>
:<math>x_{i}</math> input of neuron <math>i</math>
:<math>t_{j}</math> desired output of output neuron <math>j</math>
:<math>o_{j}</math> real output of output neuron <math>j</math>.
== Programming solution of backpropagation ==
This PHP implementation of the multilayer perceptron uses the following equations for weight changes in the hidden layers and the output layer.
=== Weight change of output layer ===
:<math>\Delta w_{k} = o_{k} \cdot (a_{k} - o_{k}) \cdot (1 - o_{k})</math>
:<math>w_{mk} = w_{mk} + \alpha \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>k</math> neuron k
:<math>o</math> output
:<math>i</math> input
:<math>a</math> desired output
:<math>w</math> weight
:<math>m</math> weight m
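A minimal sketch of this output-layer update in plain PHP (the function name is illustrative, not part of the ANN classes):

```php
<?php
// delta_k = o_k * (a_k - o_k) * (1 - o_k)
// w_mk    = w_mk + alpha * i_m * delta_k
function updateOutputWeights($weights, $inputs, $o, $a, $alpha)
{
  $delta = $o * ($a - $o) * (1 - $o);
  foreach ($weights as $m => $w)
    $weights[$m] = $w + $alpha * $inputs[$m] * $delta;
  return $weights;
}

// One update step for a neuron with output 0.6 and desired output 1.0
$weights = updateOutputWeights(array(0.2, -0.1), array(1.0, 0.5), 0.6, 1.0, 0.5);
```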
=== Weight change of hidden layers ===
:<math>s_{kl} = \sum^{n}_{l=1} w_{k} \cdot \Delta w_{l} \cdot \beta</math>
:<math>\Delta w_{k} = o_{k} \cdot (1 - o_{k}) \cdot s_{kl}</math>
:<math>w_{mk} = w_{mk} + \alpha \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>\alpha</math> learning rate
:<math>\beta</math> momentum
:<math>k</math> neuron k
:<math>l</math> neuron l
:<math>w</math> weight
:<math>m</math> weight m
:<math>i</math> input
:<math>o</math> output
:<math>n</math> count of neurons
=== Momentum ===
To avoid oscillating weight changes, the momentum factor <math>\beta</math> is introduced to damp the weight updates, so that the calculated weight change is not always the same.
=== Overfitting ===
To avoid overfitting, this PHP implementation finishes the training procedure once the actual output value is within a fault tolerance of 1 per cent of the desired output value.
=== Choosing learning rate and momentum ===
Properly choosing the learning rate (<math>\alpha</math>) and momentum (<math>\beta</math>) is a matter of experience. Both values lie in the range between 0 and 1. This PHP implementation uses a default value of 0.5 for <math>\alpha</math> and 0.95 for <math>\beta</math>. Neither <math>\alpha</math> nor <math>\beta</math> may be zero; otherwise no weight change would happen and the network would never reach an errorless level. Both factors can be changed at runtime.
=== Dynamic learning rate ===
To make the network converge faster towards its lowest error, a dynamic learning rate can be a good approach.
:<math>w_{mk} = w_{mk} + \alpha \cdot \gamma \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>\alpha \cdot \gamma = [0.5 .. 0.9]</math>
:<math>\alpha</math> learning rate
:<math>\beta</math> momentum
:<math>\gamma</math> dynamic learning rate factor
:<math>k</math> neuron k
:<math>w</math> weight
:<math>m</math> weight m
:<math>i</math> input
This PHP implementation supports dynamic learning rate by default.
=== Weight decay ===
Normally, weights grow to large values during training, although this is not necessary; with large weights, convergence of the network may take too long. The weight decay algorithm tries to avoid large weights.
The weight change algorithm without weight decay is the following:
:<math>\Delta w_{i}(t) = \alpha \cdot \frac{\part E(t)}{\part w_{i}(t)}</math>
Subtracting a term proportional to the previous weight reduces the weight change:
:<math>\Delta w_{i}(t) = \alpha \cdot \frac{\part E(t)}{\part w_{i}(t)} - \lambda \cdot w_{i}(t-1)</math>
:<math>\lambda = [0.03 .. 0.05]</math>
:<math>w</math> weight
:<math>i</math> neuron
:<math>E</math> error function
:<math>t</math> time (training step)
:<math>\alpha</math> learning rate
:<math>\lambda</math> weight decay factor
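The decayed weight change can be sketched as follows (illustrative helper, not part of the ANN classes; the sign convention follows the equations above):

```php
<?php
// Weight change with weight decay:
// delta_w(t) = alpha * dE/dw - lambda * w(t-1)
function weightChangeWithDecay($gradient, $prevWeight, $alpha, $lambda)
{
  return $alpha * $gradient - $lambda * $prevWeight;
}

// With lambda in [0.03 .. 0.05], a large previous weight shrinks the step
$dw = weightChangeWithDecay(0.8, 2.0, 0.5, 0.04);
```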
=== Quick propagation algorithm ===
The Quickprop algorithm calculates the weight change by assuming the error behaves like the quadratic function <math>f(x) = x^2</math>. Two error-gradient values at two different weights are the two points of a secant; relating this secant to the quadratic function, its minimum <math>f'(x) = 0</math> can be calculated directly. The x-coordinate of the minimum is the new weight value.
:<math>S(t) = \frac{\part E}{\part w_{i}(t)}</math>
:<math>\Delta w_{i}(t) = \alpha \cdot \frac{\part E}{\part w_{i}(t)}</math> (normal backpropagation)
:<math>\frac{\Delta w_{i}(t)}{\alpha} = \frac{\part E}{\part w_{i}(t)}</math>
:<math>S(t) = \frac{\part E}{\part w_{i}(t)} = \frac{\Delta w_{i}(t)}{\alpha}</math>
:<math>\Delta w_{i}(t) = \frac{S(t)}{S(t-1) - S(t)} \cdot \Delta w_{i}(t-1)</math> (quick propagation)
:<math>w</math> weight
:<math>i</math> neuron
:<math>E</math> error function
:<math>t</math> time (training step)
:<math>\alpha</math> learning rate
To avoid overly large changes, the maximum weight change is limited by the following equation:
:<math>\Delta w_{i}(t) \leq \mu \cdot \Delta w_{i}(t-1)</math>
:<math>\mu = [1.75 .. 2.25]</math>
:<math>w</math> weight
:<math>i</math> neuron
:<math>t</math> time (training step)
:<math>\mu</math> maximal weight change factor
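One Quickprop step, including the change limit, can be sketched as follows (the handling of the limit's sign is an assumption; the function is illustrative, not part of the ANN classes):

```php
<?php
// dw(t) = S(t) / (S(t-1) - S(t)) * dw(t-1), limited to mu * dw(t-1)
function quickpropStep($slope, $prevSlope, $prevChange, $mu)
{
  $change = $slope / ($prevSlope - $slope) * $prevChange;
  $limit = $mu * abs($prevChange);
  if (abs($change) > $limit)
    $change = ($change > 0) ? $limit : -$limit;
  return $change;
}

// Slopes S(t-1) = 0.6 and S(t) = 0.2 with a previous change of 0.1
$dw = quickpropStep(0.2, 0.6, 0.1, 2.0);
```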
== Binary and linear input ==
If binary input is used, the input value is simply 0 for ''false'' and 1 for ''true''.
:<math>0 : False</math>
:<math>1 : True</math>
Using linear input values, normalization is needed:
:<math>i = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>i</math> input value for neural network
:<math>f</math> real world value
This PHP implementation supports input normalization.
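A sketch of the normalization in plain PHP; in the library itself this is what the ''ANN_InputValue'' class does (see the Selling Icecreams example). The helper name is illustrative:

```php
<?php
// i = (f - f_min) / (f_max - f_min)
function normalizeInput($f, $fMin, $fMax)
{
  return ($f - $fMin) / ($fMax - $fMin);
}

// A temperature of 20 degrees Celsius in the range -15..50
// becomes an input value between 0 and 1
$i = normalizeInput(20, -15, 50);
```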
== Binary and linear output ==
The interpretation of output values only makes sense for the output layer, and it depends on the use of the neural network. If the network is used for classification, binary output is used; binary output has two states, true or false. Since the network always produces linear output values, these values have to be converted to binary values:
:<math>o < 0.5 : False</math>
:<math>o \geq 0.5 : True</math>
:<math>o</math> output value
If linear output is used, the output values have to be converted back to the real-world values the network was trained for:
:<math>f = o \cdot (f_{max} - f_{min}) + f_{min}</math>
:<math>f</math> real world value
:<math>o</math> real output value of neural network
While training the network, the same normalization equation as for input values is applied to the desired output values.
:<math>o = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>o</math> desired output value for neural network
:<math>f</math> real world value
This PHP implementation supports output normalization.
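Both conversions can be sketched in plain PHP; in the library itself the ''ANN_OutputValue'' class provides getOutputValue() and getRealOutputValue() for this. The helper names below are illustrative:

```php
<?php
// Binary output: threshold at 0.5
function toBinary($o)
{
  return $o >= 0.5;
}

// Linear output: f = o * (f_max - f_min) + f_min
function denormalizeOutput($o, $fMin, $fMax)
{
  return $o * ($fMax - $fMin) + $fMin;
}

// A network output of 0.3 in the range 0..300 sold ice-creams
$f = denormalizeOutput(0.3, 0, 300);
```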
1455ee6d2c495d113fa0ad027908fb66d51215c7
152
151
2008-01-18T19:32:54Z
Thwien
2
/* Programming solution of backpropagation */
wikitext
text/x-wiki
== General ==
A multilayer perceptron is a feedforward artificial neural network. This means the signal inside the neural network flows from input layer passing hidden layers to output layer. While training the error correction of neural weights are done in the opposite direction. This is done by the backpropagation algorithm.
== Activation ==
At first a cumulative input is calculated by the following equation:
:<math>s = \sum^{n}_{k=1} i_{k} \cdot w_{k}</math>
Considering the ''BIAS'' value the equation is:
:<math>s = (\sum^{n}_{k=1} i_{k} \cdot w_{k}) + BIAS \cdot w_{k}</math>
:<math>BIAS = 1</math>
=== Sigmoid activation function ===
:<math>o = \rm{sig}(s) = \frac{1}{1 + \rm e^{-s}}</math>
=== Hyperbolic tangent activation function ===
:<math>o = tanh(s)</math>
using output range between -1 and 1, or
:<math>o = \frac{tanh(s) + 1}{2}</math>
using output range between 0 and 1.
:<math>s</math> cumulative input
:<math>w</math> weight of input
:<math>i</math> value of input
:<math>n</math> number of inputs
:<math>k</math> number of neuron
== Error of neural network ==
If the neural network is initialized by random weights it has of course not the expected output. Therefore training is necessary. While supervised training known inputs and their corresponded output values are presented to the network. So it is possible to compare the real output with the desired output. The error is described as the following algorithm:
:<math>E={1\over2} \sum^{n}_{i=1} (t_{i}-o_{i})^{2}</math>
:<math>E</math> network error
:<math>n</math> count of input patterns
:<math>t_{i}</math> desired output
:<math>o_{i}</math> calculated output
== Backpropagation ==
The learning algorithm of a single layer perceptron is easy compared to a multilayer perceptron. The reason is that just the output layer is directly connected to the output, but not the hidden layers. Therefore the calculation of the right weights of the hidden layers is difficult mathematically. To get the right delta value for changing the weights of hidden neuron is described in the following equation:
:<math>\Delta w_{ij}= -\alpha \cdot {\partial E \over \partial w_{ij}} = \alpha \cdot \delta_{j} \cdot x_{i}</math>
:<math>E</math> network error
:<math>\Delta w_{ij}</math> delta value <math>w_{ij}</math> of neuron connection <math>i</math> to <math>j</math>
:<math>\alpha</math> learning rate
:<math>\delta_{j}</math> the error of neuron <math>j</math>
:<math>x_{i}</math> input of neuron <math>i</math>
:<math>t_{j}</math> desired output of output neuron <math>j</math>
:<math>o_{j}</math> real output of output neuron <math>j</math>.
== Programming solution of backpropagation ==
In this PHP implementation of multilayer perceptron the following algorithm is used for weight changes in hidden layers and output layer.
=== Weight change of output layer ===
:<math>\Delta w_{k} = o_{k} \cdot (a_{k} - o_{k}) \cdot (1 - o_{k})</math>
:<math>w_{mk} = w_{mk} + \alpha \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>k</math> neuron k
:<math>o</math> output
:<math>i</math> input
:<math>a</math> desired output
:<math>w</math> weight
:<math>m</math> weight m
=== Weight change of hidden layers ===
:<math>s_{kl} = \sum^{n}_{l=1} w_{k} \cdot \Delta w_{l} \cdot \beta</math>
:<math>\Delta w_{k} = o_{k} \cdot (1 - o_{k}) \cdot s_{kl}</math>
:<math>w_{mk} = w_{mk} + \alpha \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>\alpha</math> learning rate
:<math>\beta</math> momentum
:<math>k</math> neuron k
:<math>l</math> neuron l
:<math>w</math> weight
:<math>m</math> weight m
:<math>i</math> input
:<math>o</math> output
:<math>n</math> count of neurons
=== Momentum ===
To avoid oscillating weight changes the momentum factor <math>\beta</math> is defined. Therefore the calculated weight change would not be the same always.
=== Overfitting ===
To avoid overfitting of neural networks in this PHP implementation the training procedure is finished if real output value has a fault tolerance of 1 per cent of desired output value.
=== Choosing learning rate and momentum ===
The proper choosing of learning rate (<math>\alpha</math>) and momentum (<math>\beta</math>) is done by experience. Both values have a range between 0 and 1. This PHP implementation uses a default value of 0.5 for <math>\alpha</math> and 0.95 for <math>\beta</math>. <math>\alpha</math> and <math>\beta</math> cannot be zero. Otherwise no weight change will be happen and the network would never reach an errorless level. Theses factors can be changed by runtime.
=== Dynamic learning rate ===
To make the network converge faster towards its lowest error, using a dynamic learning rate may be a good approach.
:<math>w_{mk} = w_{mk} + \alpha \cdot \gamma \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>\alpha \cdot \gamma = [0.5 .. 0.9]</math>
:<math>\alpha</math> learning rate
:<math>\beta</math> momentum
:<math>\gamma</math> dynamic learning rate factor
:<math>k</math> neuron k
:<math>w</math> weight
:<math>m</math> weight m
:<math>i</math> input
This PHP implementation supports dynamic learning rate by default.
=== Weight decay ===
Normally weights grow to large numbers during training, although this is not actually necessary. The weight decay algorithm tries to avoid large weights, because with large weights the network convergence may take too long.
The weight change algorithm without weight decay is the following:
:<math>\Delta w_{i}(t) = \alpha \cdot \frac{\part E(t)}{\part w_{i}(t)}</math>
By subtracting a term proportional to the last weight, the weight change is reduced:
:<math>\Delta w_{i}(t) = \alpha \cdot \frac{\part E(t)}{\part w_{i}(t)} - \lambda \cdot w_{i}(t-1)</math>
:<math>\lambda = [0.03 .. 0.05]</math>
:<math>w</math> weight
:<math>i</math> neuron
:<math>E</math> error function
:<math>t</math> time (training step)
:<math>\alpha</math> learning rate
:<math>\lambda</math> weight decay factor
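The decayed weight change can be sketched directly from the equation above; an illustrative Python snippet (the project is PHP), with hypothetical names:

```python
def weight_change_with_decay(gradient, prev_weight, alpha=0.5, lam=0.04):
    """delta_w(t) = alpha * dE/dw(t) - lambda * w(t-1)

    gradient is the error derivative dE/dw at step t, prev_weight the
    weight value from the previous step; lam is the weight decay factor
    (the text suggests 0.03 .. 0.05).
    """
    return alpha * gradient - lam * prev_weight
```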
=== Quick propagation algorithm ===
The Quickprop algorithm calculates the weight change by locally modelling the error with the quadratic function <math>f(x) = x^2</math>. The error slopes at two different weight values form the two points of a secant. Fitting this secant to a quadratic function makes it possible to calculate the function's minimum (<math>f'(x) = 0</math>); the x-coordinate of that minimum becomes the new weight value.
:<math>S(t) = \frac{\part E}{\part w_{i}(t)}</math>
:<math>\Delta w_{i}(t) = \alpha \cdot \frac{\part E}{\part w_{i}(t)}</math> (normal backpropagation)
:<math>\frac{\Delta w_{i}(t)}{\alpha} = \frac{\part E}{\part w_{i}(t)}</math>
:<math>S(t) = \frac{\part E}{\part w_{i}(t)} = \frac{\Delta w_{i}(t)}{\alpha}</math>
:<math>\Delta w_{i}(t) = \frac{S(t)}{S(t-1) - S(t)} \cdot \Delta w_{i}(t-1)</math> (quick propagation)
:<math>w</math> weight
:<math>i</math> neuron
:<math>E</math> error function
:<math>t</math> time (training step)
:<math>\alpha</math> learning rate
To avoid changes that are too big, the maximum weight change is limited by the following inequality:
:<math>\Delta w_{i}(t) \leq \mu \cdot \Delta w_{i}(t-1)</math>
:<math>\mu = [1.75 .. 2.25]</math>
:<math>w</math> weight
:<math>i</math> neuron
:<math>t</math> time (training step)
:<math>\mu</math> maximal weight change factor
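The Quickprop step, including the cap on the maximum weight change, can be sketched as follows. This is an illustrative Python snippet (the project is PHP) with hypothetical names:

```python
def quickprop_step(slope_t, slope_prev, dw_prev, mu=2.0):
    """delta_w(t) = S(t) / (S(t-1) - S(t)) * delta_w(t-1),
    limited to mu * |delta_w(t-1)| so a flat secant cannot
    produce a runaway step (the text suggests mu = 1.75 .. 2.25).
    """
    dw = slope_t / (slope_prev - slope_t) * dw_prev
    cap = mu * abs(dw_prev)
    # clamp the step into [-cap, cap]
    return max(-cap, min(cap, dw))
```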
=== RProp (Resilient Propagation) ===
The RProp algorithm only uses the direction (sign) of the gradient, not its magnitude.
<math>\Delta w_{ij}(t) = \begin{cases}
-\Delta p_{ij}(t), & \text{if } \frac{\part E}{\part w_{ij}} > 0 \\
+\Delta p_{ij}(t), & \text{if } \frac{\part E}{\part w_{ij}} < 0 \\
0, & \text{if } \frac{\part E}{\part w_{ij}} = 0
\end{cases}</math>
<math>\Delta p_{ij}(t) = \begin{cases}
\alpha^+ \cdot \Delta p_{ij}(t-1), & \text{if } \frac{\part E}{\part w_{ij}}(t-1) \cdot \frac{\part E}{\part w_{ij}}(t) > 0 \\
\alpha^- \cdot \Delta p_{ij}(t-1), & \text{if } \frac{\part E}{\part w_{ij}}(t-1) \cdot \frac{\part E}{\part w_{ij}}(t) < 0 \\
\Delta p_{ij}(t-1), & \text{if } \frac{\part E}{\part w_{ij}}(t-1) \cdot \frac{\part E}{\part w_{ij}}(t) = 0
\end{cases}</math>
:<math>\alpha</math> learning rate
:<math>w</math> weight
:<math>p</math> weight change (step size)
:<math>\alpha^+ = 1.2</math>
:<math>\alpha^- = 0.5</math>
:<math>\Delta p(0) = 0.5</math>
:<math>\Delta p(t)_{max} = 50</math>
:<math>\Delta p(t)_{min} = 0</math>
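The step-size adaptation can be sketched as follows; an illustrative Python snippet (the project is PHP) with hypothetical names:

```python
def rprop_step_size(grad_t, grad_prev, p_prev,
                    inc=1.2, dec=0.5, p_max=50.0, p_min=0.0):
    """Adapt the RProp step size p from the signs of two successive
    gradients; the actual weight update is then -p or +p depending on
    the sign of the current gradient.
    """
    if grad_prev * grad_t > 0:
        p = min(p_prev * inc, p_max)   # same sign: accelerate
    elif grad_prev * grad_t < 0:
        p = max(p_prev * dec, p_min)   # sign flip: slow down
    else:
        p = p_prev                     # gradient product zero: keep step size
    return p
```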
=== RProp+ ===
If the mathematical sign of the gradient changes, the RProp+ algorithm subtracts the previous weight change from the last weight change, reverting the last step.
<math>\Delta w_{ij}(t) = \begin{cases}
\alpha^+ \cdot \Delta w_{ij}(t-1), & \text{if } \frac{\part E}{\part w_{ij}}(t-1) \cdot \frac{\part E}{\part w_{ij}}(t) > 0 \\
\Delta w_{ij}(t-1) - \Delta w_{ij}(t-2), & \text{if } \frac{\part E}{\part w_{ij}}(t-1) \cdot \frac{\part E}{\part w_{ij}}(t) < 0 \\
\Delta w_{ij}(t-1), & \text{if } \frac{\part E}{\part w_{ij}}(t-1) \cdot \frac{\part E}{\part w_{ij}}(t) = 0
\end{cases}</math>
=== iRProp+ ===
iRProp+ is an improved RProp+ algorithm with one small change: before the previous weight change is reverted, the network error is calculated and compared. If the network error increased from <math>E(t-2)</math> to <math>E(t-1)</math>, the RProp+ procedure is applied. Otherwise no change is reverted, because if <math>E(t-1)</math> is lower than <math>E(t-2)</math>, the weight change apparently moved the network towards convergence.
<math>\Delta w_{ij}(t) = \begin{cases}
\alpha^+ \cdot \Delta w_{ij}(t-1), & \text{if } \frac{\part E}{\part w_{ij}}(t-1) \cdot \frac{\part E}{\part w_{ij}}(t) > 0 \\
\Delta w_{ij}(t-1) - \Delta w_{ij}(t-2), & \text{if } \frac{\part E}{\part w_{ij}}(t-1) \cdot \frac{\part E}{\part w_{ij}}(t) < 0 \text{ and if } E(t) > E(t-1) \\
\Delta w_{ij}(t-1), & \text{if } \frac{\part E}{\part w_{ij}}(t-1) \cdot \frac{\part E}{\part w_{ij}}(t) = 0
\end{cases}</math>
== Binary and linear input ==
If binary input is used, the input value is simply 0 for ''false'' and 1 for ''true''.
:<math>0 : False</math>
:<math>1 : True</math>
When using linear input values, normalization is needed:
:<math>i = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>i</math> input value for neural network
:<math>f</math> real world value
This PHP implementation supports input normalization.
== Binary and linear output ==
Interpreting output values only makes sense for the output layer, and the interpretation depends on how the neural network is used. If the network is used for classification, binary output is used; binary output has exactly two states, true and false. Since the network always produces linear output values, these values have to be converted to binary values:
:<math>o < 0.5 : False</math>
:<math>o \geq 0.5 : True</math>
:<math>o</math> output value
When using linear output, the output values have to be mapped back to the real-world range the network was trained for:
:<math>f = o \cdot (f_{max} - f_{min}) + f_{min}</math>
:<math>f</math> real world value
:<math>o</math> real output value of neural network
While training the network, the same normalization equation as for the input values is used for the desired output values.
:<math>o = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>o</math> desired output value for neural network
:<math>f</math> real world value
This PHP implementation supports output normalization.
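The input and output normalization equations above form a simple round trip, which can be sketched as follows. This is an illustrative Python snippet (the project is PHP), with hypothetical function names:

```python
def normalize(f, f_min, f_max):
    """Map a real-world value f into [0, 1] for a network input or target."""
    return (f - f_min) / (f_max - f_min)

def denormalize(o, f_min, f_max):
    """Map a linear network output o back to the real-world range."""
    return o * (f_max - f_min) + f_min

def to_binary(o):
    """Interpret a linear output value as a binary classification result."""
    return o >= 0.5
```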
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project realizes a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' in 2002. Several improvements and changes on this implementation are done by ''Thomas Wien'' in 2007. You will find the PHP source in the section [[Download]]. Please, consider the [[Copyright]]. To get a short idea what is the benefit of neural networks have a look at page [[Neural Networks]].
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
== Versions and Change-Log ==
'''Version 2.0.3 by Thomas Wien''' (2008-01-17) [[Download]]
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.4 by Thomas Wien''' (Development)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
50b8b8b0f7ba474a66cb06a96ba016337facf020
159
158
2008-01-22T21:51:58Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project realizes a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' in 2002. Several improvements and changes on this implementation are done by ''Thomas Wien'' in 2007. You will find the PHP source in the section [[Download]]. Please, consider the [[Copyright]]. To get a short idea what is the benefit of neural networks have a look at page [[Neural Networks]].
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
== Versions and Change-Log ==
'''Version 2.0.3 by Thomas Wien''' (2008-01-17) [[Download]]
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.4 by Thomas Wien''' (Development)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
504f404d238da3c4633b0b5c28a1144f0f468e16
160
159
2008-01-27T17:45:08Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project realizes a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' in 2002. Several improvements and changes on this implementation are done by ''Thomas Wien'' in 2007. You will find the PHP source in the section [[Download]]. Please, consider the [[Copyright]]. To get a short idea what is the benefit of neural networks have a look at page [[Neural Networks]].
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
== Versions and Change-Log ==
'''Version 2.0.3 by Thomas Wien''' (2008-01-17) [[Download]]
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.4 by Thomas Wien''' (Development)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Todo ==
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
0b832037493441cdb907a90bbbee59e3340a13ba
161
160
2008-01-27T19:32:18Z
Thwien
2
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; ''Thomas Wien'' has made several improvements and changes to this implementation since 2007. You will find the PHP source in the [[Download]] section. Please respect the [[Copyright]]. For a short introduction to the benefits of neural networks, see the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
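For orientation, several of these features (the hyperbolic tangent transfer function, momentum, and weight decay) follow standard backpropagation formulas, sketched below in common notation; the exact PHP implementation may differ in details:

```latex
% Hyperbolic tangent transfer function and its derivative
\varphi(x) = \tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}},
\qquad \varphi'(x) = 1 - \tanh^{2}(x)

% Weight update for the connection from neuron i to neuron j,
% with learning rate \eta, error delta \delta_j, output o_i,
% momentum term \alpha and weight decay \lambda
\Delta w_{ij}(t) = \eta \, \delta_j \, o_i
  + \alpha \, \Delta w_{ij}(t-1)
  - \lambda \, w_{ij}(t)
```

The momentum term reuses the previous weight change to smooth oscillations, while weight decay shrinks weights slightly each step to reduce overfitting.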
== Versions and Change-Log ==
'''Version 2.0.5 by Thomas Wien''' (2008-12-16) [[Download]]
* Adjustable output error tolerance between 0 and 10 percent
* Internal rounding of floats for performance reasons
* Loading class for all ANN classes (SPL autoload)
* Renaming the file of the ANN_Maths class
* Improving code standards
* Fixing bug: comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if an input pattern already produces its desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relations at construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding scattered internal calls to setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating the total network error for CSV logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to a CSV file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.6 by Thomas Wien''' (Development)
* [todo]
== Todo ==
* Implementation of synapses
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
90a1e6d11c976b209fcd538edd49ff7be40a9d57
171
169
2008-12-16T17:45:34Z
Thwien
2
/* Todo */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002. Several improvements and changes to this implementation have been made by ''Thomas Wien'' since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short idea of what neural networks are good for, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.0.5 by Thomas Wien''' (2008-12-16) [[Download]]
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.6 by Thomas Wien''' (Development)
* [todo]
== Todo ==
* Implementation of synapses
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces
08ce23b272d4380451100c14e120606fbddd4859
182
171
2008-12-16T18:52:24Z
Thwien
2
/* Todo */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002. Several improvements and changes to this implementation have been made by ''Thomas Wien'' since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short idea of what neural networks are good for, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.0.5 by Thomas Wien''' (2008-12-16) [[Download]]
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.6 by Thomas Wien''' (Development)
* [todo]
== Todo ==
* Implementation of synapses
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces
* Improving license of source code
acdfd1346f2e9a611a5dbcbda9a7ecf9894e2b23
184
182
2008-12-18T12:32:46Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002. Several improvements and changes to this implementation have been made by ''Thomas Wien'' since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short idea of what neural networks are good for, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.0.5 by Thomas Wien''' (2008-12-16) [[Download]]
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.6 by Thomas Wien''' (Development)
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
== Todo ==
* Implementation of synapses
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces
* Improving license of source code
32b4984232648fa2d2b3339a9d9205688aefe344
194
184
2008-12-18T13:58:36Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002. Several improvements and changes to this implementation have been made by ''Thomas Wien'' since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short idea of what neural networks are good for, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.0.5 by Thomas Wien''' (2008-12-16) [[Download]]
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.6 by Thomas Wien''' (Development)
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
== Todo ==
* Implementation of synapses
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces
* Improving license of source code
674447b1ae640e6eb2d06288832b2fa12d53a071
195
194
2008-12-18T15:49:32Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002. Several improvements and changes to this implementation have been made by ''Thomas Wien'' since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short idea of what neural networks are good for, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.0.5 by Thomas Wien''' (2008-12-16) [[Download]]
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparision in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.6 by Thomas Wien''' (Development)
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
== Todo ==
* Implementation of synapses
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces
* Improving license of source code
722ff78597e29f14c6507d247a2543e3fa8f642c
196
195
2008-12-18T16:06:15Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002. Several improvements and changes to this implementation have been made by ''Thomas Wien'' since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short idea of what neural networks are good for, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.0.5 by Thomas Wien''' (2008-12-16) [[Download]]
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparision in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.6 by Thomas Wien''' (Development)
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Todo ==
* Implementation of synapses
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces
* Improving license of source code
d45957ace715a249d0465c1286acec2f87ee0cfa
197
196
2008-12-18T16:07:03Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002. Several improvements and changes to this implementation have been made by ''Thomas Wien'' since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short idea of what neural networks are good for, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.0.5 by Thomas Wien''' (2008-12-16) [[Download]]
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.6 by Thomas Wien''' (Development)
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Todo ==
* Implementation of synapses
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces
* Improving license of source code
40cd803526a697472042ed00c5adc3c4fa9f55d8
200
197
2008-12-18T17:47:17Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002. Several improvements and changes to this implementation have been made by ''Thomas Wien'' since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short idea of what neural networks are good for, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
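Several of the training-related features above (momentum, learning rate, weight decay) act on the standard backpropagation weight update. A minimal sketch of that update step follows; this is illustrative Python with arbitrary parameter values, not the library's actual PHP code:

```python
def update_weight(weight, gradient, prev_delta,
                  learning_rate=0.5, momentum=0.9, weight_decay=0.0):
    # Gradient-descent step plus a momentum term (a fraction of the
    # previous weight change); weight decay additionally shrinks the
    # weight slightly toward zero on every update.
    delta = -learning_rate * gradient + momentum * prev_delta
    new_weight = weight + delta - weight_decay * weight
    return new_weight, delta

# Three illustrative updates with a constant gradient: the momentum
# term makes each weight change larger than the last.
w, d = 1.0, 0.0
for _ in range(3):
    w, d = update_weight(w, gradient=0.2, prev_delta=d, weight_decay=0.01)
print(round(w, 4))
```

The momentum term is what lets training accelerate along a consistent error gradient instead of taking uniformly small steps.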
== Versions and Change-Log ==
'''Version 2.0.6 by Thomas Wien''' (2008-12-18) [[Download]]
* Printing network details of the differences between outputs and their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples in the PHPDoc documentation
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduced overfitting (no training step if an input pattern already produces the desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.7 by Thomas Wien''' (Development)
* [todo]
== Todo ==
* Implementation of synapses
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Wiki: More details on installation and use
* Wiki: Project-specific logo ('''''Done!''''')
* PHPDoc: More details in the documentation
* Supporting PHP 5.3 namespaces
* Improving the license of the source code
Neural Networks
== Questions and Answers ==
* '''What is a neural network?'''
An artificial neural network is a mathematical model of an associative biological neural network.
* '''What are the differences between a biological and an artificial neural network?'''
There are huge differences between the two. The neurons of a biological neural network work massively in parallel, and such a network consists of billions of densely connected neurons. In comparison, an artificial neural network has only a small number of neurons, depending on the amount of data the network has to associate, e.g. 20 neurons. Such an artificial network is not connected at random, but as a controlled, planned layer structure with an input layer, one or more hidden layers, and an output layer (if a [[multilayer perceptron]] topology is used). And since an artificial neural network is computer software, it works sequentially, which is of course another big difference to a biological neural network.
* '''What is the benefit of an artificial neural network?'''
The big advantage of the human brain is its ability to map previously unseen input onto a learned model stored in its neurons. This is called association. Technically it is a form of fuzzy information processing, so an artificial neural network can be used in environments where the input information is incomplete (as in face recognition).
* '''What are examples for technical use of artificial neural networks?'''
** '''''Prediction of power consumption:''''' The local electricity utility in Duesseldorf, Germany uses a [[multilayer perceptron]] for the daily prediction of power consumption in the city based on temperature, humidity, weekday, and other factors.
** '''''Prediction of product sales:''''' Prediction of how many units of a product a supermarket will sell in one week. With this information it is possible to optimize ordering and storage.
** '''''Prediction of incoming calls to a call centre:''''' Prediction of the number of daily calls in a call centre, to plan how many co-workers have to work that day.
** '''''Post code recognition:''''' The German Post uses neural networks to recognize post codes (PLZ) written on letters by machine or by hand. After recognition a machine-readable code is printed on the letter, so further recognition is not necessary.
** '''''Solving mathematical functions:''''' Artificial neural networks are also used to find numeric solutions to difficult mathematical functions. The network can associate the output of a function for one or more unknown inputs. This is possible only after learning known inputs and outputs; the artificial network then has an "idea" of the mathematical rule.
** '''''Classification:''''' General classification can also be done with a neural network. For example, such a network can detect from an image whether the person shown is looking left or right, laughing or not, male or female, and so on.
** '''''Intelligent network routing:''''' A network router that learns the fastest routing of internet packets could be built using neural networks; such a router can then optimize its routing decisions.
** '''''Detecting spam mails:''''' Neural networks are also used to detect spam mails in mail clients or on mail servers.
** '''''Stock market:''''' Prediction of stock prices based on market rules and psychological reactions.
** '''''Credit assessment:''''' Determining the credit risk of bank customers by taking different aspects into account.
** '''''Network traffic estimation:''''' Prediction of how much traffic is to be expected for a server cluster at a specific time of day.
** '''''CRAP (Change Risk Analysis and Prediction):''''' Analysing error patterns in program structures to predict errors that could occur in selected modules or methods during refactoring, based on experience the neural network gained while testing the software.
** '''''Cancer detection:''''' Cancer detection based on several medical test results.
** '''''Air traffic control'''''
** '''''Robot control'''''
** '''''Game strategy control'''''
** '''''Noise tolerance of analogue modems'''''
** '''''Scheduling buses, trams, air planes and elevators'''''
** '''''Optimization of traffic flows'''''
** '''''Weather forecast'''''
** '''''Music composition:''''' A neural network can learn rules of composition and create its own music based on them. (But to this day I have never heard neurally composed music.)
** '''''Compiler optimization:''''' There are plans to extend the ''gcc'' compiler with neural networks so that it learns how to compile code for higher performance without the source code being optimized manually.
* '''How to train an artificial neural network?'''
The same way a human brain does: learning by doing, and practice, practice, practice. The artificial neural network receives inputs and produces an output. This output is compared with the desired output. If there is a difference between them, the network must be changed. This procedure is repeated until the output matches the desired output; after that, the artificial network "knows" the rules for producing the right output. This is done for all known inputs and outputs. After training you can use the "knowledge" of the network to predict output patterns for related known or unknown inputs.
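The train-and-compare cycle described above can be sketched with a single threshold neuron learning the logical AND function (a deliberately tiny, language-agnostic illustration; the PHP library's classes and methods are not used here):

```python
# Sketch of the training cycle: present an input, compare the output
# with the desired output, adjust the weights, and repeat.
# A single threshold neuron learns the logical AND function.
patterns = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
weights = [0.0, 0.0]
bias = 0.0
rate = 0.1

for epoch in range(20):                      # repeat until outputs fit
    for (x1, x2), desired in patterns:
        output = 1 if x1 * weights[0] + x2 * weights[1] + bias > 0 else 0
        error = desired - output             # difference to desired output
        weights[0] += rate * error * x1      # change the network only when
        weights[1] += rate * error * x2      # output and desired differ
        bias += rate * error

# After training the network "knows" the rule:
results = [1 if x1 * weights[0] + x2 * weights[1] + bias > 0 else 0
           for (x1, x2), _ in patterns]
print(results)  # prints [0, 0, 0, 1]
```

A multilayer perceptron generalizes this scheme: the error is propagated backwards through the hidden layers, but the basic compare-and-adjust loop stays the same.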
* '''Can an artificial neural network forget?'''
Yes. Since, like a human brain, it is an associative processor, an artificial network can also make mistakes and can forget knowledge if certain inputs are trained only rarely. If you as a human being rarely practise playing the piano, you cannot play it perfectly. The same applies to the artificial network.
* '''Has an artificial neural network consciousness?'''
The one and only answer can be: '''No'''. How consciousness arises in a biological neural network has not been figured out to this day, but for an artificial neural network it can safely be said that there is no consciousness. It would not gain consciousness even if the complexity of the artificial network were very large. Therefore an artificial neural network cannot die, feel pain, or become depressed.
* '''Do artificial neural networks have disadvantages?'''
Yes, there are a few problems. A number of neural network topologies exist today, and each topology is suited to solving a particular kind of problem. Neural networks are slow to train, because training can take several tens of thousands of loops. A neural network can also overfit if it is overtrained. Another problem is the extrapolation of data if data is not presented properly to the neural network.
* '''Which kinds of artificial neural networks are existing?'''
There are several. For example: [[multilayer perceptron]], self-organizing maps (Kohonen networks), Hopfield networks, generic maps, and more.
== Information about Neural Networks ==
English
* [http://en.wikipedia.org/wiki/Neural_Network Neural Network (en.wikipedia.org)]
* [http://en.wikipedia.org/wiki/Multilayer_perceptron Multilayer Perceptron (en.wikipedia.org)]
German
* [http://de.wikipedia.org/wiki/K%C3%BCnstliches_neuronales_Netz Künstliches Neuronales Netz (de.wikipedia.org)]
* [http://de.wikipedia.org/wiki/Perzeptron#Mehrlagiges_Perzeptron Mehrlagiges Perzeptron (de.wikipedia.org)]
Download
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
The current stable version of the ANN implementation for PHP 5.x is '''version 2.0.4'''. Go to the [[Installation]] section for information on how to integrate these PHP libraries into your project.
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or above. If the client-server mechanism of the network class is used, the '''curl extension''' must be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' must be installed.
== Version '''2.0.4''' (2008-01-27) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if an input pattern already produces the desired output)
* Improving performance of activation
* Improving performance of testing all patterns against their desired outputs
* Improving performance of calculating hidden deltas
* Improving performance by defining layer relations at construction
* More details in printNetwork()
* Fixing bug: learning rate was not part of the saved delta value
== Version '''2.0.3''' (2008-01-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-server model for distributed applications
* Calculating total network error for CSV logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Tangens hyperbolicus transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
d2555924f346e5ff8bad7e285df6e65d5a6b01b0
168
162
2008-12-16T17:36:47Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
Current stable version of ANN implementation for PHP 5.x is '''version 2.0.4'''. Go to [[Installation]] section to get information on how to implement these PHP libraries into your project.
== Requirements ==
The latest implementation requires a php environment running '''PHP 5.2.x''' or above. If using the client-server mechanism of the network class, the '''curl extension''' must be available on the client host which connects to an ANN server. If using class ''ANN_NetworkGraph'' '''GD library with png support''' must be installed.
== Version '''2.0.5''' (2008-12-16) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparision in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Tangens hyperbolicus transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
0c9f8e0d6b4bf8a019a7eab3e2d10e5dbb5421a6
170
168
2008-12-16T17:43:17Z
Thwien
2
/* Version '''2.0.5''' (2008-12-16) '''''stable''''' */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
Current stable version of ANN implementation for PHP 5.x is '''version 2.0.4'''. Go to [[Installation]] section to get information on how to implement these PHP libraries into your project.
== Requirements ==
The latest implementation requires a php environment running '''PHP 5.2.x''' or above. If using the client-server mechanism of the network class, the '''curl extension''' must be available on the client host which connects to an ANN server. If using class ''ANN_NetworkGraph'' '''GD library with png support''' must be installed.
== Version '''2.0.5''' (2008-12-16) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparision in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Tangens hyperbolicus transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
fbe2356c968e38b24dabdc5ad00edd4bb49bbd4a
198
170
2008-12-18T17:44:42Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
Current stable version of ANN implementation for PHP 5.x is '''version 2.0.4'''. Go to [[Installation]] section to get information on how to implement these PHP libraries into your project.
== Requirements ==
The latest implementation requires a php environment running '''PHP 5.2.x''' or above. If using the client-server mechanism of the network class, the '''curl extension''' must be available on the client host which connects to an ANN server. If using class ''ANN_NetworkGraph'' '''GD library with png support''' must be installed.
== Version '''2.0.6''' (2008-12-18) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparision in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Tangens hyperbolicus transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
e4af6e256a5c17dd86310fa5fe88c3ec12aef08d
199
198
2008-12-18T17:46:08Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
The current stable version of the ANN implementation for PHP 5.x is '''version 2.0.6'''. See the [[Installation]] section for information on how to integrate these PHP libraries into your project.
== Requirements ==
The latest implementation requires an environment running '''PHP 5.2.x''' or above. If the client-server mechanism of the network class is used, the '''curl extension''' must be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' must be installed.
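These requirements can be verified at runtime before loading the library. A minimal sketch using standard PHP functions (the warning texts are illustrative; this check is not part of the library itself):

```php
<?php
// Illustrative environment check for the ANN library: PHP version,
// curl for the client-server mode, GD with PNG support for ANN_NetworkGraph.
if (version_compare(PHP_VERSION, '5.2.0', '<'))
{
    die('PHP 5.2.x or above is required.');
}

if (!extension_loaded('curl'))
{
    print 'Warning: curl extension missing - client-server mode is unavailable.' . "\n";
}

if (!extension_loaded('gd') || !function_exists('imagepng'))
{
    print 'Warning: GD with PNG support missing - ANN_NetworkGraph is unavailable.' . "\n";
}
```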
== Version '''2.0.6''' (2008-12-18) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details showing the differences between outputs and their desired values
* Completely rewritten variable naming standard
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: comparison in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Documentation (online)'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if an input pattern already produces its desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relations at construction time
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes into several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Tangens hyperbolicus transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
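The ''tangens hyperbolicus'' transfer function listed above squashes a neuron's weighted input sum into the range -1 to 1. A minimal sketch of such an activation (illustrative only; the function and variable names are not taken from the library's internal code):

```php
<?php
// Sketch of a tanh neuron activation: the weighted sum of the inputs plus
// a threshold (bias) is squashed by tanh() into the range -1..1.
// Illustrative only; this is not the library's internal implementation.
function activate(array $arrInputs, array $arrWeights, $floatThreshold)
{
    $floatSum = $floatThreshold;

    foreach ($arrInputs as $intIndex => $floatInput)
    {
        $floatSum += $floatInput * $arrWeights[$intIndex];
    }

    return tanh($floatSum);
}

print activate(array(1, 0), array(0.5, -0.3), 0.1); // tanh(0.6), about 0.537
```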
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
[http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
Logical XOR function
== Training ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network;
}
$arrInputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$arrOutputs = array(
array(0),
array(1),
array(1),
array(0)
);
$objNetwork->setInputs($arrInputs);
$objNetwork->setOutputs($arrOutputs);
$objNetwork->train();
$objNetwork->saveToFile('xor.dat');
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
die('Network not found');
}
$arrInputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$objNetwork->setInputs($arrInputs);
print_r($objNetwork->getOutputs());
</source>
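For a binary-output network such as this XOR example, the outputs are floating-point values near 0 or 1 rather than exact booleans. A small sketch of reading off the predictions (the output values shown are assumed examples, not results of an actual training run):

```php
<?php
// Sketch: interpreting the near-binary outputs of a trained XOR network.
// The values below are assumed example outputs, not real training results.
$arrOutputs = array(array(0.03), array(0.97), array(0.95), array(0.08));

foreach ($arrOutputs as $arrOutput)
{
    // Rounding recovers the boolean XOR result: 0, 1, 1, 0
    print round($arrOutput[0]) . "\n";
}
```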
Selling Icecreams
== Training ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network(2, 8, 1);
$objTemperature = new ANN_InputValue(-15, 50); // Temperature in Celsius
$objTemperature->saveToFile('input_temperature.dat');
unset($objTemperature);
$objHumidity = new ANN_InputValue(0, 100); // Humidity percentage
$objHumidity->saveToFile('input_humidity.dat');
unset($objHumidity);
$objIcecream = new ANN_OutputValue(0, 300); // Quantity of sold ice-creams
$objIcecream->saveToFile('output_quantity.dat');
unset($objIcecream);
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
print 'Error loading value objects';
}
$arrInputs = array(
array($objTemperature->getInputValue(20), $objHumidity->getInputValue(10)),
array($objTemperature->getInputValue(30), $objHumidity->getInputValue(40)),
array($objTemperature->getInputValue(32), $objHumidity->getInputValue(30)),
array($objTemperature->getInputValue(33), $objHumidity->getInputValue(20))
);
$arrOutputs = array(
array($objIcecream->getOutputValue(20)),
array($objIcecream->getOutputValue(90)),
array($objIcecream->getOutputValue(70)),
array($objIcecream->getOutputValue(75))
);
$objNetwork->setInputs($arrInputs);
$objNetwork->setOutputs($arrOutputs);
$objNetwork->train();
$objNetwork->saveToFile('icecreams.dat');
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$network = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Network not found.';
}
try
{
$temperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$humidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$icecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
print 'Error loading value objects';
}
$inputs = array(
array($temperature->getInputValue(20), $humidity->getInputValue(10)),
array($temperature->getInputValue(30), $humidity->getInputValue(40)),
array($temperature->getInputValue(32), $humidity->getInputValue(30)),
array($temperature->getInputValue(33), $humidity->getInputValue(20))
);
$network->setInputs($inputs);
$outputs = $network->getOutputs();
foreach($outputs as $output)
print $icecream->getRealOutputValue($output). '<br />';
</source>
aa5527f35e7875d8b8a9a1c7bf6f2b6e3bbf2098
189
188
2008-12-18T12:51:09Z
Thwien
2
/* Using trained network */
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network(2, 8, 1);
$objTemperature = new ANN_InputValue(-15, 50); // Temperature in Celsius
$objTemperature->saveToFile('input_temperature.dat');
unset($objTemperature);
$objHumidity = new ANN_InputValue(0, 100); // Humidity percentage
$objHumidity->saveToFile('input_humidity.dat');
unset($objHumidity);
$objIcecream = new ANN_OutputValue(0, 300); // Quantity of sold ice-creams
$objIcecream->saveToFile('output_quantity.dat');
unset($objIcecream);
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
print 'Error loading value objects';
}
$arrInputs = array(
array($objTemperature->getInputValue(20), $objHumidity->getInputValue(10)),
array($objTemperature->getInputValue(30), $objHumidity->getInputValue(40)),
array($objTemperature->getInputValue(32), $objHumidity->getInputValue(30)),
array($objTemperature->getInputValue(33), $objHumidity->getInputValue(20))
);
$arrOutputs = array(
array($objIcecream->getOutputValue(20)),
array($objIcecream->getOutputValue(90)),
array($objIcecream->getOutputValue(70)),
array($objIcecream->getOutputValue(75))
);
$objNetwork->setInputs($arrInputs);
$objNetwork->setOutputs($arrOutputs);
$objNetwork->train();
$objNetwork->saveToFile('icecreams.dat');
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Network not found.';
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
print 'Error loading value objects';
}
$arrInputs = array(
array($objTemperature->getInputValue(20), $objHumidity->getInputValue(10)),
array($objTemperature->getInputValue(30), $objHumidity->getInputValue(40)),
array($objTemperature->getInputValue(32), $objHumidity->getInputValue(30)),
array($objTemperature->getInputValue(33), $objHumidity->getInputValue(20))
);
$objNetwork->setInputs($arrInputs);
$arrOutputs = $objNetwork->getOutputs();
foreach($arrOutputs as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput). '<br />';
</source>
94421bccf963eb78f1eef08f52d8c7319bf8072f
Visual network topology
0
10
176
88
2008-12-16T17:49:02Z
Thwien
2
/* PNG image of network topology */
wikitext
text/x-wiki
== PNG image of network topology ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Network not found';
exit;
}
$image = new ANN_NetworkGraph($network);
$image->saveToFile('network.png');
</source>
== Output ==
[[Image:network.png|800px|Image of network topology]]
489a0c1bb4c80d0cc0ccaea0096e933d3525024c
190
176
2008-12-18T12:52:23Z
Thwien
2
/* PNG image of network topology */
wikitext
text/x-wiki
== PNG image of network topology ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Network not found';
exit;
}
$objNetworkImage = new ANN_NetworkGraph($objNetwork);
$objNetworkImage->saveToFile('network.png');
</source>
== Output ==
[[Image:network.png|800px|Image of network topology]]
679a5bb6ef062167d15d6bc426092b657d9ad14f
Logging network weights
0
12
177
145
2008-12-16T18:18:13Z
Thwien
2
/* Logging network weights while training */
wikitext
text/x-wiki
== Logging network weights while training ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$network = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$network = new ANN_Network;
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$outputs = array(
array(0),
array(1),
array(1),
array(0)
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network->logWeightsToFile('network.csv'); // Start logging
$network->train();
$network->saveToFile('xor.dat');
</source>
9f0c1fd0d6733235ce7d44f179bcd15106564b28
191
177
2008-12-18T12:53:44Z
Thwien
2
/* Logging network weights while training */
wikitext
text/x-wiki
== Logging network weights while training ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network;
}
$arrInputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$arrOutputs = array(
array(0),
array(1),
array(1),
array(0)
);
$objNetwork->setInputs($arrInputs);
$objNetwork->setOutputs($arrOutputs);
$objNetwork->logWeightsToFile('network.csv'); // Start logging
$objNetwork->train();
$objNetwork->saveToFile('xor.dat');
</source>
429608ffb9cfeb4fffb34c5e5a998f06dfd8eb82
Client-Server model
0
15
178
122
2008-12-16T18:19:13Z
Thwien
2
/* Server implementation */
wikitext
text/x-wiki
== Server implementation ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
class ANN_MyServer extends ANN_Server
{
// ****************************************************************************
/**
* @param string $username
* @param string $password
* @return boolean
*/
protected function checkLogin($username, $password)
{
// User-defined authentication, e.g. against a database
return ($username == 'username' && $password == 'password');
}
// ****************************************************************************
}
$server = new ANN_MyServer;
</source>
== Client implementation ==
<source lang="php">
require_once 'ANN/ANN_Network.php';
try
{
$network = new ANN_Network;
}
catch(Exception $e)
{
print 'Network could not be created';
exit;
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$outputs = array(
array(0),
array(1),
array(1),
array(0)
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network = $network->trainByHost('username', 'password', 'http://example.tld/ANN_Server.php');
if($network instanceof ANN_Network)
$network->printNetwork();
</source>
ef2d671f92bddfd3a64a44ec159a6c72b3241595
179
178
2008-12-16T18:19:22Z
Thwien
2
/* Client implementation */
wikitext
text/x-wiki
== Server implementation ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
class ANN_MyServer extends ANN_Server
{
// ****************************************************************************
/**
* @param string $username
* @param string $password
* @return boolean
*/
protected function checkLogin($username, $password)
{
// User-defined authentication, e.g. against a database
return ($username == 'username' && $password == 'password');
}
// ****************************************************************************
}
$server = new ANN_MyServer;
</source>
== Client implementation ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$network = new ANN_Network;
}
catch(Exception $e)
{
print 'Network could not be created';
exit;
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$outputs = array(
array(0),
array(1),
array(1),
array(0)
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network = $network->trainByHost('username', 'password', 'http://example.tld/ANN_Server.php');
if($network instanceof ANN_Network)
$network->printNetwork();
</source>
43445d484bda8091f71f613fc485009087be8fae
192
179
2008-12-18T12:56:56Z
Thwien
2
/* Server implementation */
wikitext
text/x-wiki
== Server implementation ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
class ANN_MyServer extends ANN_Server
{
// ****************************************************************************
/**
* @param string $strUsername
* @param string $strPassword
* @return boolean
*/
protected function checkLogin($strUsername, $strPassword)
{
// User-defined authentication, e.g. against a database
return ($strUsername == 'username' && $strPassword == 'password');
}
// ****************************************************************************
}
$objServer = new ANN_MyServer;
</source>
== Client implementation ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$network = new ANN_Network;
}
catch(Exception $e)
{
print 'Network could not be created';
exit;
}
$inputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$outputs = array(
array(0),
array(1),
array(1),
array(0)
);
$network->setInputs($inputs);
$network->setOutputs($outputs);
$network = $network->trainByHost('username', 'password', 'http://example.tld/ANN_Server.php');
if($network instanceof ANN_Network)
$network->printNetwork();
</source>
9f6da6612bd957a6f28f9be2ee4f044b731cdc32
193
192
2008-12-18T12:59:06Z
Thwien
2
/* Client implementation */
wikitext
text/x-wiki
== Server implementation ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
class ANN_MyServer extends ANN_Server
{
// ****************************************************************************
/**
* @param string $strUsername
* @param string $strPassword
* @return boolean
*/
protected function checkLogin($strUsername, $strPassword)
{
// User-defined authentication, e.g. against a database
return ($strUsername == 'username' && $strPassword == 'password');
}
// ****************************************************************************
}
$objServer = new ANN_MyServer;
</source>
== Client implementation ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = new ANN_Network;
}
catch(Exception $e)
{
print 'Network could not be created';
exit;
}
$arrInputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$arrOutputs = array(
array(0),
array(1),
array(1),
array(0)
);
$objNetwork->setInputs($arrInputs);
$objNetwork->setOutputs($arrOutputs);
$objNetwork = $objNetwork->trainByHost('username', 'password', 'http://example.tld/ANN_Server.php');
if($objNetwork instanceof ANN_Network)
$objNetwork->printNetwork();
</source>
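The checkLogin() example above compares credentials with a plain ==. For a real deployment a timing-safe comparison is preferable, so that the response time does not leak how many leading characters of a guess matched. A minimal, language-neutral sketch of the same check in Python using hmac.compare_digest (the 'username'/'password' placeholders are kept from the wiki example and are not real credentials):

```python
import hmac

def check_login(username: str, password: str) -> bool:
    """Timing-safe credential check, analogous to ANN_MyServer::checkLogin."""
    expected_user = 'username'  # placeholder, as in the wiki example
    expected_pass = 'password'  # placeholder, as in the wiki example
    # compare_digest runs in constant time with respect to matching
    # prefixes, unlike an ordinary string comparison.
    ok_user = hmac.compare_digest(username, expected_user)
    ok_pass = hmac.compare_digest(password, expected_pass)
    return ok_user and ok_pass

print(check_login('username', 'password'))  # True
print(check_login('username', 'wrong'))     # False
```

In PHP 5.6+ the equivalent would be hash_equals(); for the PHP 5.x versions this library targets, a user-supplied constant-time compare would be needed.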
2b94f02a0935f6898a53c102fc83d8905dd38a6b
Installation
0
3
180
126
2008-12-16T18:21:48Z
Thwien
2
/* Installation */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Installation ==
* [[Download]] the source code
* Unpack the source code
>tar -xzf ann205.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source code:
<source lang="php">
<?php
require_once ('ANN/ANN_Network.php');
$ann = new ANN_Network;
?>
</source>
* Learn to use the library. Have a look at the [[Examples]] chapter.
60ae91881aea6a1888938529758eb437bdc9e7c6
181
180
2008-12-16T18:22:01Z
Thwien
2
/* Installation */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Installation ==
* [[Download]] the source code
* Unpack the source code
>tar -xzf ann205.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source code:
<source lang="php">
<?php
require_once ('ANN/ANN_Loader.php');
$ann = new ANN_Network;
?>
</source>
* Learn to use the library. Have a look at the [[Examples]] chapter.
9c406b2478c43dd50099f402a127f884cc30e9b2
Logical XOR function
0
8
201
187
2008-12-18T17:51:46Z
Thwien
2
/* Training */
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network;
$objValues = new ANN_Values;
$objValues->train()
->input(0,0)->output(0)
->input(0,1)->output(1)
->input(1,0)->output(1)
->input(1,1)->output(0);
$objValues->saveToFile('values_xor.dat');
unset($objValues);
}
try
{
$objValues = ANN_Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // must be called as of version 2.0.6
$objNetwork->train();
$objNetwork->saveToFile('xor.dat');
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
die('Network not found');
}
$arrInputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$objNetwork->setInputs($arrInputs);
print_r($objNetwork->getOutputs());
</source>
5a8e1166f75cd17d3bb80de9108a3ad492aa9caf
202
201
2008-12-18T17:54:34Z
Thwien
2
/* Using trained network */
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network;
$objValues = new ANN_Values;
$objValues->train()
->input(0,0)->output(0)
->input(0,1)->output(1)
->input(1,0)->output(1)
->input(1,1)->output(0);
$objValues->saveToFile('values_xor.dat');
unset($objValues);
}
try
{
$objValues = ANN_Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // must be called as of version 2.0.6
$objNetwork->train();
$objNetwork->saveToFile('xor.dat');
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
die('Network not found');
}
try
{
$objValues = ANN_Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objValues->input(0, 1) // additional input values, appended to the loaded ones
->input(1, 1)
->input(1, 0)
->input(0, 0)
->input(0, 1)
->input(1, 1);
$objNetwork->setValues($objValues);
print_r($objNetwork->getOutputs());
</source>
a505d5404f5efcc041618a74dcfbea64c791dbe8
247
202
2009-05-25T16:22:14Z
Thwien
2
/* Training */
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network;
$objValues = new ANN_Values;
$objValues->train()
->input(0,0)->output(0)
->input(0,1)->output(1)
->input(1,0)->output(1)
->input(1,1)->output(0);
$objValues->saveToFile('values_xor.dat');
unset($objValues);
}
try
{
$objValues = ANN_Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // must be called as of version 2.0.6
$boolTrained = $objNetwork->train();
print ($boolTrained)
? 'Network trained'
: 'Network not trained completely. Please re-run the script';
$objNetwork->saveToFile('xor.dat');
$objNetwork->printNetwork();
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
die('Network not found');
}
try
{
$objValues = ANN_Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objValues->input(0, 1) // additional input values, appended to the loaded ones
->input(1, 1)
->input(1, 0)
->input(0, 0)
->input(0, 1)
->input(1, 1);
$objNetwork->setValues($objValues);
print_r($objNetwork->getOutputs());
</source>
629083b9c353cfa20d920299c89635711757976f
249
247
2009-05-25T16:29:15Z
Thwien
2
/* Training */
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network;
$objValues = new ANN_Values;
$objValues->train()
->input(0,0)->output(0)
->input(0,1)->output(1)
->input(1,0)->output(1)
->input(1,1)->output(0);
$objValues->saveToFile('values_xor.dat');
unset($objValues);
}
try
{
$objValues = ANN_Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // must be called as of version 2.0.6
$boolTrained = $objNetwork->train();
print ($boolTrained)
? 'Network trained'
: 'Network not trained completely. Please re-run the script';
$objNetwork->saveToFile('xor.dat');
$objNetwork->printNetwork();
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
die('Network not found');
}
try
{
$objValues = ANN_Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objValues->input(0, 1) // additional input values, appended to the loaded ones
->input(1, 1)
->input(1, 0)
->input(0, 0)
->input(0, 1)
->input(1, 1);
$objNetwork->setValues($objValues);
print_r($objNetwork->getOutputs());
</source>
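What the trained network approximates can be illustrated with a hand-wired 2-2-1 multilayer perceptron that computes XOR exactly. The Python sketch below uses a hard threshold and hand-picked weights for clarity; they are illustrative only, not the weights (or the sigmoid-style activation) that ANN_Network actually learns:

```python
def step(x: float) -> int:
    # Hard threshold used in place of a smooth transfer function;
    # it makes the hand-picked weights exact.
    return 1 if x >= 0 else 0

def xor_mlp(a: int, b: int) -> int:
    """2-2-1 perceptron computing XOR with fixed, hand-picked weights.

    Hidden unit 1 fires for OR(a, b); hidden unit 2 fires for AND(a, b);
    the output fires for OR-and-not-AND, which is XOR.
    """
    h1 = step(a + b - 0.5)          # OR gate
    h2 = step(a + b - 1.5)          # AND gate
    return step(h1 - 2 * h2 - 0.5)  # OR and not AND

for a, b in ((0, 0), (0, 1), (1, 0), (1, 1)):
    print(a, b, '->', xor_mlp(a, b))  # 0, 1, 1, 0
```

This is also why a hidden layer is needed at all: XOR is not linearly separable, so no single-layer perceptron can represent it.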
72bfb18df173f653357dcff70fd0c488a8eb9787
Selling Icecreams
0
9
203
189
2008-12-18T17:57:20Z
Thwien
2
/* Training */
wikitext
text/x-wiki
== Training ==
THIS EXAMPLE IS NOT UP TO DATE. PLEASE HAVE A LOOK AT THE XOR EXAMPLE!
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network(2, 8, 1);
$objTemperature = new ANN_InputValue(-15, 50); // Temperature in Celsius
$objTemperature->saveToFile('input_temperature.dat');
unset($objTemperature);
$objHumidity = new ANN_InputValue(0, 100); // Humidity percentage
$objHumidity->saveToFile('input_humidity.dat');
unset($objHumidity);
$objIcecream = new ANN_OutputValue(0, 300); // Quantity of sold ice-creams
$objIcecream->saveToFile('output_quantity.dat');
unset($objIcecream);
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
print 'Error loading value objects';
}
$arrInputs = array(
array($objTemperature->getInputValue(20), $objHumidity->getInputValue(10)),
array($objTemperature->getInputValue(30), $objHumidity->getInputValue(40)),
array($objTemperature->getInputValue(32), $objHumidity->getInputValue(30)),
array($objTemperature->getInputValue(33), $objHumidity->getInputValue(20))
);
$arrOutputs = array(
array($objIcecream->getOutputValue(20)),
array($objIcecream->getOutputValue(90)),
array($objIcecream->getOutputValue(70)),
array($objIcecream->getOutputValue(75))
);
$objNetwork->setInputs($arrInputs);
$objNetwork->setOutputs($arrOutputs);
$objNetwork->train();
$objNetwork->saveToFile('icecreams.dat');
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Network not found.';
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
print 'Error loading value objects';
}
$arrInputs = array(
array($objTemperature->getInputValue(20), $objHumidity->getInputValue(10)),
array($objTemperature->getInputValue(30), $objHumidity->getInputValue(40)),
array($objTemperature->getInputValue(32), $objHumidity->getInputValue(30)),
array($objTemperature->getInputValue(33), $objHumidity->getInputValue(20))
);
$objNetwork->setInputs($arrInputs);
$arrOutputs = $objNetwork->getOutputs();
foreach($arrOutputs as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput). '<br />';
</source>
470d8958f501034bc72f780e87ef0932d5361cd3
206
203
2008-12-19T06:25:41Z
Thwien
2
/* Training */
wikitext
text/x-wiki
== Training ==
THIS EXAMPLE IS NOT UP TO DATE. PLEASE HAVE A LOOK AT THE XOR EXAMPLE!
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network(2, 8, 1);
$objTemperature = new ANN_InputValue(-15, 50); // Temperature in Celsius
$objTemperature->saveToFile('input_temperature.dat');
$objHumidity = new ANN_InputValue(0, 100); // Humidity percentage
$objHumidity->saveToFile('input_humidity.dat');
$objIcecream = new ANN_OutputValue(0, 300); // Quantity of sold ice-creams
$objIcecream->saveToFile('output_quantity.dat');
$objValues = new ANN_Values;
$objValues->train()
->input(
$objTemperature->getInputValue(20),
$objHumidity->getInputValue(10)
)
->output(
$objIcecream->getOutputValue(20)
)
->input(
$objTemperature->getInputValue(30),
$objHumidity->getInputValue(40)
)
->output(
$objIcecream->getOutputValue(90)
)
->input(
$objTemperature->getInputValue(32),
$objHumidity->getInputValue(30)
)
->output(
$objIcecream->getOutputValue(70)
)
->input(
$objTemperature->getInputValue(33),
$objHumidity->getInputValue(20)
)
->output(
$objIcecream->getOutputValue(75)
);
$objValues->saveToFile('values_icecreams.dat');
unset($objValues);
unset($objTemperature);
unset($objHumidity);
unset($objIcecream);
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
print 'Error loading value objects';
}
try
{
$objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // must be called as of version 2.0.6
$objNetwork->train();
$objNetwork->saveToFile('icecreams.dat');
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Network not found.';
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
print 'Error loading value objects';
}
$arrInputs = array(
array($objTemperature->getInputValue(20), $objHumidity->getInputValue(10)),
array($objTemperature->getInputValue(30), $objHumidity->getInputValue(40)),
array($objTemperature->getInputValue(32), $objHumidity->getInputValue(30)),
array($objTemperature->getInputValue(33), $objHumidity->getInputValue(20))
);
$objNetwork->setInputs($arrInputs);
$arrOutputs = $objNetwork->getOutputs();
foreach($arrOutputs as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput). '<br />';
</source>
6a9c88cc64a520bee9209b35d9998f37d6e67c42
207
206
2008-12-19T06:31:55Z
Thwien
2
/* Using trained network */
wikitext
text/x-wiki
== Training ==
THIS EXAMPLE IS NOT UP TO DATE. PLEASE HAVE A LOOK AT THE XOR EXAMPLE!
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network(2, 8, 1);
$objTemperature = new ANN_InputValue(-15, 50); // Temperature in Celsius
$objTemperature->saveToFile('input_temperature.dat');
$objHumidity = new ANN_InputValue(0, 100); // Humidity percentage
$objHumidity->saveToFile('input_humidity.dat');
$objIcecream = new ANN_OutputValue(0, 300); // Quantity of sold ice-creams
$objIcecream->saveToFile('output_quantity.dat');
$objValues = new ANN_Values;
$objValues->train()
->input(
$objTemperature->getInputValue(20),
$objHumidity->getInputValue(10)
)
->output(
$objIcecream->getOutputValue(20)
)
->input(
$objTemperature->getInputValue(30),
$objHumidity->getInputValue(40)
)
->output(
$objIcecream->getOutputValue(90)
)
->input(
$objTemperature->getInputValue(32),
$objHumidity->getInputValue(30)
)
->output(
$objIcecream->getOutputValue(70)
)
->input(
$objTemperature->getInputValue(33),
$objHumidity->getInputValue(20)
)
->output(
$objIcecream->getOutputValue(75)
);
$objValues->saveToFile('values_icecreams.dat');
unset($objValues);
unset($objTemperature);
unset($objHumidity);
unset($objIcecream);
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
print 'Error loading value objects';
}
try
{
$objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // must be called as of version 2.0.6
$objNetwork->train();
$objNetwork->saveToFile('icecreams.dat');
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
die('Network not found');
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objValues->input( // additional input values, appended to the loaded ones
$objTemperature->getInputValue(17),
$objHumidity->getInputValue(12)
)
->input(
$objTemperature->getInputValue(31),
$objHumidity->getInputValue(42)
)
->input(
$objTemperature->getInputValue(31),
$objHumidity->getInputValue(34)
)
->input(
$objTemperature->getInputValue(34),
$objHumidity->getInputValue(21)
);
$objNetwork->setValues($objValues);
$arrOutputs = $objNetwork->getOutputs();
foreach($arrOutputs as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput). '<br />';
</source>
dba613728041fae09aba3c842c835cedf5d4b7de
208
207
2008-12-19T06:33:33Z
Thwien
2
/* Training */
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network(2, 8, 1);
$objTemperature = new ANN_InputValue(-15, 50); // Temperature in Celsius
$objTemperature->saveToFile('input_temperature.dat');
$objHumidity = new ANN_InputValue(0, 100); // Humidity percentage
$objHumidity->saveToFile('input_humidity.dat');
$objIcecream = new ANN_OutputValue(0, 300); // Quantity of sold ice-creams
$objIcecream->saveToFile('output_quantity.dat');
$objValues = new ANN_Values;
$objValues->train()
->input(
$objTemperature->getInputValue(20),
$objHumidity->getInputValue(10)
)
->output(
$objIcecream->getOutputValue(20)
)
->input(
$objTemperature->getInputValue(30),
$objHumidity->getInputValue(40)
)
->output(
$objIcecream->getOutputValue(90)
)
->input(
$objTemperature->getInputValue(32),
$objHumidity->getInputValue(30)
)
->output(
$objIcecream->getOutputValue(70)
)
->input(
$objTemperature->getInputValue(33),
$objHumidity->getInputValue(20)
)
->output(
$objIcecream->getOutputValue(75)
);
$objValues->saveToFile('values_icecreams.dat');
unset($objValues);
unset($objTemperature);
unset($objHumidity);
unset($objIcecream);
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
print 'Error loading value objects';
}
try
{
$objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // must be called as of version 2.0.6
$objNetwork->train();
$objNetwork->saveToFile('icecreams.dat');
</source>
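The value objects above map real-world ranges such as ANN_InputValue(-15, 50) for temperature and ANN_OutputValue(0, 300) for sold ice-creams into the unit interval the network operates on, and back again. Assuming getInputValue() and getRealOutputValue() perform plain min-max scaling (an assumption; the library source is authoritative), the mapping can be sketched in Python:

```python
def to_unit(value: float, lo: float, hi: float) -> float:
    """Min-max scale a real-world value into [0, 1].

    Presumed behaviour of ANN_InputValue::getInputValue for a
    range such as ANN_InputValue(-15, 50).
    """
    return (value - lo) / (hi - lo)

def from_unit(scaled: float, lo: float, hi: float) -> float:
    """Inverse mapping, as ANN_OutputValue::getRealOutputValue would
    need for a range such as ANN_OutputValue(0, 300)."""
    return lo + scaled * (hi - lo)

print(to_unit(20, -15, 50))     # 20 degrees C -> ~0.538
print(from_unit(0.25, 0, 300))  # network output 0.25 -> 75.0 ice-creams
```

Scaling every input into the same interval keeps one input (e.g. humidity 0..100) from dominating another (temperature -15..50) during training.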
== Using trained network ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
die('Network not found');
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objValues->input( // append further input values to those already loaded
$objTemperature->getInputValue(17),
$objHumidity->getInputValue(12)
)
->input(
$objTemperature->getInputValue(31),
$objHumidity->getInputValue(42)
)
->input(
$objTemperature->getInputValue(31),
$objHumidity->getInputValue(34)
)
->input(
$objTemperature->getInputValue(34),
$objHumidity->getInputValue(21)
);
$objNetwork->setValues($objValues);
$arrOutputs = $objNetwork->getOutputs();
foreach($arrOutputs as $arrOutput)
foreach($arrOutput as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput). '<br />';
</source>
9fb557baa248003d4a4fe7e64a37609c3c3e8a3f
218
208
2008-12-23T20:40:00Z
Thwien
2
/* Using trained network */
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network(2, 8, 1);
$objTemperature = new ANN_InputValue(-15, 50); // Temperature in Celsius
$objTemperature->saveToFile('input_temperature.dat');
$objHumidity = new ANN_InputValue(0, 100); // Humidity percentage
$objHumidity->saveToFile('input_humidity.dat');
$objIcecream = new ANN_OutputValue(0, 300); // Quantity of sold ice-creams
$objIcecream->saveToFile('output_quantity.dat');
$objValues = new ANN_Values;
$objValues->train()
->input(
$objTemperature->getInputValue(20),
$objHumidity->getInputValue(10)
)
->output(
$objIcecream->getOutputValue(20)
)
->input(
$objTemperature->getInputValue(30),
$objHumidity->getInputValue(40)
)
->output(
$objIcecream->getOutputValue(90)
)
->input(
$objTemperature->getInputValue(32),
$objHumidity->getInputValue(30)
)
->output(
$objIcecream->getOutputValue(70)
)
->input(
$objTemperature->getInputValue(33),
$objHumidity->getInputValue(20)
)
->output(
$objIcecream->getOutputValue(75)
);
$objValues->saveToFile('values_icecreams.dat');
unset($objValues);
unset($objTemperature);
unset($objHumidity);
unset($objIcecream);
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$objNetwork->train();
$objNetwork->saveToFile('icecreams.dat');
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
die('Network not found');
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objValues->input( // append further input values to those already loaded
$objTemperature->getInputValue(17),
$objHumidity->getInputValue(12)
)
->input(
$objTemperature->getInputValue(31),
$objHumidity->getInputValue(42)
)
->input(
$objTemperature->getInputValue(31),
$objHumidity->getInputValue(34)
)
->input(
$objTemperature->getInputValue(34),
$objHumidity->getInputValue(21)
);
$objNetwork->setValues($objValues);
$arrOutputs = $objNetwork->getOutputs();
foreach($arrOutputs as $arrOutput)
foreach($arrOutput as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput). '<br />';
</source>
6f29a6307deedf7469e0c4475c8a0557c7c42b28
226
218
2008-12-28T14:44:13Z
Thwien
2
/* Training */
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network(2, 8, 1);
$objTemperature = new ANN_InputValue(-15, 50); // Temperature in Celsius
$objTemperature->saveToFile('input_temperature.dat');
$objHumidity = new ANN_InputValue(0, 100); // Humidity percentage
$objHumidity->saveToFile('input_humidity.dat');
$objIcecream = new ANN_OutputValue(0, 300); // Quantity of sold ice-creams
$objIcecream->saveToFile('output_quantity.dat');
$objValues = new ANN_Values;
$objValues->train()
->input(
$objTemperature->getInputValue(20),
$objHumidity->getInputValue(10)
)
->output(
$objIcecream->getOutputValue(20)
)
->input(
$objTemperature->getInputValue(30),
$objHumidity->getInputValue(40)
)
->output(
$objIcecream->getOutputValue(90)
)
->input(
$objTemperature->getInputValue(32),
$objHumidity->getInputValue(30)
)
->output(
$objIcecream->getOutputValue(70)
)
->input(
$objTemperature->getInputValue(33),
$objHumidity->getInputValue(20)
)
->output(
$objIcecream->getOutputValue(75)
);
$objValues->saveToFile('values_icecreams.dat');
unset($objValues);
unset($objTemperature);
unset($objHumidity);
unset($objIcecream);
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$objNetwork->train();
$objNetwork->saveToFile('icecreams.dat');
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
die('Network not found');
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objValues->input( // append further input values to those already loaded
$objTemperature->getInputValue(17),
$objHumidity->getInputValue(12)
)
->input(
$objTemperature->getInputValue(31),
$objHumidity->getInputValue(42)
)
->input(
$objTemperature->getInputValue(31),
$objHumidity->getInputValue(34)
)
->input(
$objTemperature->getInputValue(34),
$objHumidity->getInputValue(21)
);
$objNetwork->setValues($objValues);
$arrOutputs = $objNetwork->getOutputs();
foreach($arrOutputs as $arrOutput)
foreach($arrOutput as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput). '<br />';
</source>
bb1345ed9f6be556a7e50455c044349223737058
248
226
2009-05-25T16:23:21Z
Thwien
2
/* Training */
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network(2, 8, 1);
$objTemperature = new ANN_InputValue(-15, 50); // Temperature in Celsius
$objTemperature->saveToFile('input_temperature.dat');
$objHumidity = new ANN_InputValue(0, 100); // Humidity percentage
$objHumidity->saveToFile('input_humidity.dat');
$objIcecream = new ANN_OutputValue(0, 300); // Quantity of sold ice-creams
$objIcecream->saveToFile('output_quantity.dat');
$objValues = new ANN_Values;
$objValues->train()
->input(
$objTemperature->getInputValue(20),
$objHumidity->getInputValue(10)
)
->output(
$objIcecream->getOutputValue(20)
)
->input(
$objTemperature->getInputValue(30),
$objHumidity->getInputValue(40)
)
->output(
$objIcecream->getOutputValue(90)
)
->input(
$objTemperature->getInputValue(32),
$objHumidity->getInputValue(30)
)
->output(
$objIcecream->getOutputValue(70)
)
->input(
$objTemperature->getInputValue(33),
$objHumidity->getInputValue(20)
)
->output(
$objIcecream->getOutputValue(75)
);
$objValues->saveToFile('values_icecreams.dat');
unset($objValues);
unset($objTemperature);
unset($objHumidity);
unset($objIcecream);
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$boolTrained = $objNetwork->train();
print ($boolTrained)
? 'Network trained'
: 'Network not trained completely. Please re-run the script';
$objNetwork->saveToFile('icecreams.dat');
$objNetwork->printNetwork();
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
die('Network not found');
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objValues->input( // append further input values to those already loaded
$objTemperature->getInputValue(17),
$objHumidity->getInputValue(12)
)
->input(
$objTemperature->getInputValue(31),
$objHumidity->getInputValue(42)
)
->input(
$objTemperature->getInputValue(31),
$objHumidity->getInputValue(34)
)
->input(
$objTemperature->getInputValue(34),
$objHumidity->getInputValue(21)
);
$objNetwork->setValues($objValues);
$arrOutputs = $objNetwork->getOutputs();
foreach($arrOutputs as $arrOutput)
foreach($arrOutput as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput). '<br />';
</source>
69c41d1ab7df2f333a2707f7e4d70a7a265bd4b1
250
248
2009-05-25T16:29:40Z
Thwien
2
/* Training */
wikitext
text/x-wiki
== Training ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network(2, 8, 1);
$objTemperature = new ANN_InputValue(-15, 50); // Temperature in Celsius
$objTemperature->saveToFile('input_temperature.dat');
$objHumidity = new ANN_InputValue(0, 100); // Humidity percentage
$objHumidity->saveToFile('input_humidity.dat');
$objIcecream = new ANN_OutputValue(0, 300); // Quantity of sold ice-creams
$objIcecream->saveToFile('output_quantity.dat');
$objValues = new ANN_Values;
$objValues->train()
->input(
$objTemperature->getInputValue(20),
$objHumidity->getInputValue(10)
)
->output(
$objIcecream->getOutputValue(20)
)
->input(
$objTemperature->getInputValue(30),
$objHumidity->getInputValue(40)
)
->output(
$objIcecream->getOutputValue(90)
)
->input(
$objTemperature->getInputValue(32),
$objHumidity->getInputValue(30)
)
->output(
$objIcecream->getOutputValue(70)
)
->input(
$objTemperature->getInputValue(33),
$objHumidity->getInputValue(20)
)
->output(
$objIcecream->getOutputValue(75)
);
$objValues->saveToFile('values_icecreams.dat');
unset($objValues);
unset($objTemperature);
unset($objHumidity);
unset($objIcecream);
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$boolTrained = $objNetwork->train();
print ($boolTrained)
? 'Network trained'
: 'Network not trained completely. Please re-run the script';
$objNetwork->saveToFile('icecreams.dat');
$objNetwork->printNetwork();
</source>
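The value objects hide the normalization step: the network itself works on values in a fixed internal range, while ANN_InputValue(-15, 50) and ANN_OutputValue(0, 300) translate between real-world units and that range. The plain-PHP sketch below only illustrates the min-max mapping such classes typically apply; it is not part of the ANN library, and the internal range of 0 to 1 is an assumption.
<source lang="php">
// Hypothetical illustration of min-max scaling, NOT part of the ANN library.
// Assumes the library maps [min, max] linearly onto [0, 1].
function scaleInput($floatValue, $floatMin, $floatMax)
{
  return ($floatValue - $floatMin) / ($floatMax - $floatMin);
}
function scaleOutput($floatNetworkValue, $floatMin, $floatMax)
{
  return $floatMin + $floatNetworkValue * ($floatMax - $floatMin);
}
print scaleInput(20, -15, 50);  // temperature 20 Celsius -> about 0.538
print scaleOutput(0.3, 0, 300); // network output 0.3 -> 90 ice-creams
</source>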
== Using trained network ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
die('Network not found');
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objValues->input( // append further input values to those already loaded
$objTemperature->getInputValue(17),
$objHumidity->getInputValue(12)
)
->input(
$objTemperature->getInputValue(31),
$objHumidity->getInputValue(42)
)
->input(
$objTemperature->getInputValue(31),
$objHumidity->getInputValue(34)
)
->input(
$objTemperature->getInputValue(34),
$objHumidity->getInputValue(21)
);
$objNetwork->setValues($objValues);
$arrOutputs = $objNetwork->getOutputs();
foreach($arrOutputs as $arrOutput)
foreach($arrOutput as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput). '<br />';
</source>
ca5d2087c84f641da7a52d563560567687aa9c1b
Logging network weights
0
12
204
191
2008-12-18T17:57:54Z
Thwien
2
/* Logging network weights while training */
wikitext
text/x-wiki
== Logging network weights while training ==
THIS EXAMPLE IS NOT UP TO DATE. PLEASE HAVE A LOOK AT THE XOR EXAMPLE!
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network;
}
$arrInputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$arrOutputs = array(
array(0),
array(1),
array(1),
array(0)
);
$objNetwork->setInputs($arrInputs);
$objNetwork->setOutputs($arrOutputs);
$objNetwork->logWeightsToFile('network.csv'); // Start logging
$objNetwork->train();
$objNetwork->saveToFile('xor.dat');
</source>
0835a8eb0f7715b0ab7aa8a3d9f6b78eafcb511c
211
204
2008-12-19T06:41:16Z
Thwien
2
/* Logging network weights while training */
wikitext
text/x-wiki
== Logging network weights while training ==
THIS EXAMPLE IS NOT UP TO DATE. PLEASE HAVE A LOOK AT THE XOR EXAMPLE!
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network;
$objValues = new ANN_Values;
$objValues->train()
->input(0,0)->output(0)
->input(0,1)->output(1)
->input(1,0)->output(1)
->input(1,1)->output(0);
$objValues->saveToFile('values_xor.dat');
unset($objValues);
}
try
{
$objValues = ANN_Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$objNetwork->logWeightsToFile('network.csv'); // Start logging
$objNetwork->train();
$objNetwork->saveToFile('xor.dat');
</source>
c5de7be43d6feb969bc4f866927ed026cc3b36b1
212
211
2008-12-19T06:41:24Z
Thwien
2
/* Logging network weights while training */
wikitext
text/x-wiki
== Logging network weights while training ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network;
$objValues = new ANN_Values;
$objValues->train()
->input(0,0)->output(0)
->input(0,1)->output(1)
->input(1,0)->output(1)
->input(1,1)->output(0);
$objValues->saveToFile('values_xor.dat');
unset($objValues);
}
try
{
$objValues = ANN_Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$objNetwork->logWeightsToFile('network.csv'); // Start logging
$objNetwork->train();
$objNetwork->saveToFile('xor.dat');
</source>
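After training, network.csv can be inspected like any other CSV file, for example to chart how the weights converge over the epochs. The reader below is only a sketch: the exact column layout and delimiter of the file written by logWeightsToFile() depend on the library version.
<source lang="php">
// Sketch only: dumps the logged weights file to the browser.
// Column layout and delimiter of network.csv are version-dependent.
$handle = fopen('network.csv', 'r');
if($handle !== false)
{
  while(($arrRow = fgetcsv($handle)) !== false)
    print implode(' | ', $arrRow). "<br />\n";

  fclose($handle);
}
</source>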
c0396e3f53a39275c5e398cbd9437728646aaf73
Client-Server model
0
15
205
193
2008-12-18T17:58:25Z
Thwien
2
/* Client implementation */
wikitext
text/x-wiki
== Server implementation ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
class ANN_MyServer extends ANN_Server
{
// ****************************************************************************
/**
* @param string $strUsername
* @param string $strPassword
* @return boolean
*/
protected function checkLogin($strUsername, $strPassword)
{
// User-defined authentication by database for example
return ($strUsername == 'username' && $strPassword == 'password');
}
// ****************************************************************************
}
$objServer = new ANN_MyServer;
</source>
== Client implementation ==
THIS EXAMPLE IS NOT UP TO DATE. PLEASE HAVE A LOOK AT THE XOR EXAMPLE!
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = new ANN_Network;
}
catch(Exception $e)
{
print 'Network could not be created';
exit;
}
$arrInputs = array(
array(0, 0),
array(0, 1),
array(1, 0),
array(1, 1)
);
$arrOutputs = array(
array(0),
array(1),
array(1),
array(0)
);
$objNetwork->setInputs($arrInputs);
$objNetwork->setOutputs($arrOutputs);
$objNetwork = $objNetwork->trainByHost('username', 'password', 'http://example.tld/ANN_Server.php');
if($objNetwork instanceof ANN_Network)
$objNetwork->printNetwork();
</source>
42f168c75496407ba2dc1ecde60ac7ffbd212bc1
209
205
2008-12-19T06:38:21Z
Thwien
2
/* Client implementation */
wikitext
text/x-wiki
== Server implementation ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
class ANN_MyServer extends ANN_Server
{
// ****************************************************************************
/**
* @param string $strUsername
* @param string $strPassword
* @return boolean
*/
protected function checkLogin($strUsername, $strPassword)
{
// User-defined authentication by database for example
return ($strUsername == 'username' && $strPassword == 'password');
}
// ****************************************************************************
}
$objServer = new ANN_MyServer;
</source>
== Client implementation ==
THIS EXAMPLE IS NOT UP TO DATE. PLEASE HAVE A LOOK AT THE XOR EXAMPLE!
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = new ANN_Network;
$objValues = new ANN_Values;
$objValues->train()
->input(0,0)->output(0)
->input(0,1)->output(1)
->input(1,0)->output(1)
->input(1,1)->output(0);
}
catch(Exception $e)
{
die('Network could not be created');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$objNetwork = $objNetwork->trainByHost(
'username',
'password',
'http://example.tld/ANN_Server.php'
);
if($objNetwork instanceof ANN_Network)
$objNetwork->printNetwork();
</source>
0c8bbf7d93776e7359b2170102d64818c51f0435
210
209
2008-12-19T06:38:31Z
Thwien
2
/* Client implementation */
wikitext
text/x-wiki
== Server implementation ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
class ANN_MyServer extends ANN_Server
{
// ****************************************************************************
/**
* @param string $strUsername
* @param string $strPassword
* @return boolean
*/
protected function checkLogin($strUsername, $strPassword)
{
// User-defined authentication by database for example
return ($strUsername == 'username' && $strPassword == 'password');
}
// ****************************************************************************
}
$objServer = new ANN_MyServer;
</source>
== Client implementation ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = new ANN_Network;
$objValues = new ANN_Values;
$objValues->train()
->input(0,0)->output(0)
->input(0,1)->output(1)
->input(1,0)->output(1)
->input(1,1)->output(0);
}
catch(Exception $e)
{
die('Network could not be created');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$objNetwork = $objNetwork->trainByHost(
'username',
'password',
'http://example.tld/ANN_Server.php'
);
if($objNetwork instanceof ANN_Network)
$objNetwork->printNetwork();
</source>
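The checkLogin() implementation above compares plain-text literals, which is fine for a demo. For a real deployment the credentials should come from storage and be compared as hashes. The variant below is only a sketch: the $arrUsers map and the sha1() scheme are illustrative and not part of the ANN library.
<source lang="php">
// Illustrative variant of ANN_MyServer::checkLogin() using hashed passwords
protected function checkLogin($strUsername, $strPassword)
{
  // In practice loaded from a database; values are sha1 hashes of the passwords
  $arrUsers = array('username' => sha1('password'));

  return isset($arrUsers[$strUsername])
      && $arrUsers[$strUsername] === sha1($strPassword);
}
</source>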
b2a733787ec6cc9366b21b0d6c371c632b710857
User talk:Thwien
3
17
213
2008-12-23T15:25:09Z
Iber
10
New page: Hey Thomas, My name is Ilya, I am a PHP developer from Toronto, Canada. I absolutely adore your project and the approach you took. It would be great to see more practical examples, simple...
wikitext
text/x-wiki
Hey Thomas,
My name is Ilya, I am a PHP developer from Toronto, Canada.
I absolutely adore your project and the approach you took.
It would be great to see more practical examples, simple games like TicTacToe (http://www.neuralplay.com/).
Let me know if you need any help - I would love to contribute!
Thanks and happy holidays,
Ilya Ber
ilya@blackriverweb.com
8efa1489b6504111c923ce646d5aa294b1459304
214
213
2008-12-23T15:25:22Z
Iber
10
wikitext
text/x-wiki
Hey Thomas,
My name is Ilya, I am a PHP developer from Toronto, Canada.
I absolutely adore your project and the approach you took.
It would be great to see more practical examples, simple games like TicTacToe (http://www.neuralplay.com/).
Let me know if you need any help - I would love to contribute!
Thanks and happy holidays,
Ilya Ber
ilya@blackriverweb.com
79308871cc3a7569dffdc59dccb438cdc1e08680
220
214
2008-12-23T20:56:44Z
Thwien
2
New section: [[User talk:Thwien#Adding code examples|Adding code examples]]
wikitext
text/x-wiki
Hey Thomas,
My name is Ilya, I am a PHP developer from Toronto, Canada.
I absolutely adore your project and the approach you took.
It would be great to see more practical examples, simple games like TicTacToe (http://www.neuralplay.com/).
Let me know if you need any help - I would love to contribute!
Thanks and happy holidays,
Ilya Ber
ilya@blackriverweb.com
== Adding code examples ==
Hi Ilya, thanks a lot for using this PHP5 library. Feel free to add more code examples showing how to use this library. Please consider using the latest stable version for your examples. I appreciate any help from you.
4cc50f71b32d809b090883d3af5e4a7e0c92fb48
Talk:Selling Icecreams
1
18
215
2008-12-23T19:53:26Z
Iber
10
New page: foreach($arrOutputs as $floatOutput) print $objIcecream->getRealOutputValue($floatOutput). '<br />'; should be foreach($arrOutputs as $floatOutput) print $objIcecream->getRealOutp...
wikitext
text/x-wiki
foreach($arrOutputs as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput). '<br />';
should be
foreach($arrOutputs as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput[0]). '<br />';
As $floatOutput is an Array
865275ce6f9a46b54010a01d2e4ffc1c8dcad643
216
215
2008-12-23T19:54:11Z
Iber
10
wikitext
text/x-wiki
... foreach($arrOutputs as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput) ...
Should be
... foreach($arrOutputs as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput[0] ...
As $floatOutput is an Array
e42185bfdd281ff23fc511f5fa70d4e38faca37c
217
216
2008-12-23T19:54:28Z
Iber
10
wikitext
text/x-wiki
... foreach($arrOutputs as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput) ...
Should be
... foreach($arrOutputs as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput[0] ...
As '''$floatOutput''' is an ''Array''
0ea253e407a862459c6b83e28d24fb9a0209bd4a
219
217
2008-12-23T20:49:34Z
Thwien
2
New section: [[Talk:Selling Icecreams#Handling output values|Handling output values]]
wikitext
text/x-wiki
... foreach($arrOutputs as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput) ...
Should be
... foreach($arrOutputs as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput[0] ...
As '''$floatOutput''' is an ''Array''
== Handling output values ==
Yes, that example code was wrong. I changed the lines to the correct code. Thanks a lot for your comment.
132514615771e70a98297f1c123290d536608416
Main Page
0
1
221
200
2008-12-27T13:42:24Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002. Several improvements and changes to this implementation have been made by ''Thomas Wien'' since 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. To get a short idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.0.6 by Thomas Wien''' (2008-12-18) [[Download]]
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.7 by Thomas Wien''' (Development)
* ANN_Neuron: removing protected method setOutput()
* fixing bug: ANN_Neuron::getOutput is float and not array
== Todo ==
* Implementation of synapses
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Wiki: More details to installation and use
* Wiki: Project specific logo ( '''''Done!''''' )
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces
* Improving license of source code
bfbc246547d401268c55b5f6ef3f379013b4dda5
224
221
2008-12-28T13:53:53Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002. Several improvements and changes to this implementation have been made by ''Thomas Wien'' since 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. To get a short idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.0.6 by Thomas Wien''' (2008-12-18) [[Download]]
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes into several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance improvements
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.7 by Thomas Wien''' (Development)
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* Fixing bug: ANN_Neuron::getOutput() returns a float and not an array
== Todo ==
* Implementation of synapses
* Examples
* ANN_InputArray + ANN_OutputArray
* Performance check depending on host system
* Wiki: More details on installation and use
* Wiki: Project-specific logo ( '''''Done!''''' )
* PHPDoc: More detailed documentation
* Supporting PHP 5.3 namespaces
* Improving license of source code
236
235
2009-01-01T14:18:06Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; several improvements and changes have been made to this implementation by ''Thomas Wien'' since 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. To get a short idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
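The output type detection listed above can be illustrated with a small heuristic. This Python sketch assumes a simple rule (all desired outputs exactly 0 or 1 means binary); the library's actual detection logic may differ:

```python
# Illustrative heuristic, not the library's actual code: decide whether
# desired outputs look binary (threshold output) or linear (float output).
def detect_output_type(desired_outputs):
    values = [v for pattern in desired_outputs for v in pattern]
    # Exactly 0/1 everywhere -> binary output; anything else -> linear.
    return "binary" if all(v in (0, 1) for v in values) else "linear"

detect_output_type([[0, 1], [1, 1]])  # "binary"
detect_output_type([[0.2, 0.9]])      # "linear"
```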
== Versions and Change-Log ==
'''Version 2.0.6 by Thomas Wien''' (2008-12-18) [[Download]]
* Printing network details showing the differences between outputs and their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples added to the PHPDoc documentation
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance reasons
* Loader class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relations at construction time
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding scattered internal calls to setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for CSV logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to a CSV file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance improvements
* Avoiding array_keys() and srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.7 by Thomas Wien''' (2009-01-01)
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions in ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.8 by Thomas Wien''' (Development)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
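The threshold and hyperbolic tangent transfer functions listed in the 2.0.0 change log can be sketched as follows. Python is used here purely for illustration (the library itself is PHP), and the function names are illustrative, not the library's API.

```python
import math

def tanh_transfer(x):
    """Hyperbolic tangent transfer function: squashes input to (-1, 1)."""
    return math.tanh(x)

def threshold(x, limit=0.5):
    """Binary threshold: fires 1 if the activation exceeds the limit, else 0."""
    return 1 if x > limit else 0

# A binary output neuron passes the summed, weighted input through the
# transfer function and then thresholds the result to 0 or 1.
activation = tanh_transfer(1.2)
fired = threshold(activation)
```

For a summed input of 1.2, tanh yields roughly 0.83, which exceeds the 0.5 limit, so the neuron fires.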
== Todo ==
* Implementation of synapses (?)
* More examples
* Performance check depending on the host system
* Wiki: more details on installation and use
* PHPDoc: more details in the documentation
* Supporting PHP 5.3 namespaces
* Improving the license agreement of the source code
* Class parsing before including (compiler switches)
a67bcd2c85e673c5d1856f3f1a53a556bd762292
237
236
2009-01-01T14:48:34Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project realizes a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' in 2002. Several improvements and changes on this implementation are done by ''Thomas Wien'' since 2007. You will find the PHP source in the section [[Download]]. Please, consider the [[Copyright]]. To get a short idea what is the benefit of neural networks have a look at page [[Neural Networks]].
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.0.6 by Thomas Wien''' (2008-12-18) [[Download]]
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.7 by Thomas Wien''' (Development)
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments depending on output type in ANN_Neuron::adjustWeights()
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput is float and not array
== Todo ==
* Implementation of synapses (?)
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces
* Improving license agreement of source code
* Class parsing before including (compiler switches)
8e44edecc50978ffa51bf8ecee3c969a88e5234f
238
237
2009-01-01T14:50:29Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project realizes a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' in 2002. Several improvements and changes on this implementation are done by ''Thomas Wien'' since 2007. You will find the PHP source in the section [[Download]]. Please, consider the [[Copyright]]. To get a short idea what is the benefit of neural networks have a look at page [[Neural Networks]].
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.0.6 by Thomas Wien''' (2008-12-18) [[Download]]
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.7 by Thomas Wien''' (Development)
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments depending on output type in ANN_Neuron::adjustWeights()
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
== Todo ==
* Implementation of synapses (?)
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces
* Improving license agreement of source code
* Class parsing before including (compiler switches)
f99166aeeff83572c93e0b3e0daf66413265275c
239
238
2009-01-01T14:55:34Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project realizes a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' in 2002. Several improvements and changes on this implementation are done by ''Thomas Wien'' since 2007. You will find the PHP source in the section [[Download]]. Please, consider the [[Copyright]]. To get a short idea what is the benefit of neural networks have a look at page [[Neural Networks]].
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.0.6 by Thomas Wien''' (2008-12-18) [[Download]]
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.7 by Thomas Wien''' (Development)
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
== Todo ==
* Implementation of synapses (?)
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces
* Improving license agreement of source code
* Class parsing before including (compiler switches)
73bbd4ac56cf99fa1e29736940d962bd9936c8a9
241
239
2009-01-01T15:08:21Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project realizes a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' in 2002. Several improvements and changes on this implementation are done by ''Thomas Wien'' since 2007. You will find the PHP source in the section [[Download]]. Please, consider the [[Copyright]]. To get a short idea what is the benefit of neural networks have a look at page [[Neural Networks]].
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.8 by Thomas Wien''' (Development)
* [todo]
== Todo ==
* Implementation of synapses (?)
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces
* Improving license agreement of source code
* Class parsing before including (compiler switches)
e677f7918126502f268550f169d49eab5a4ad57e
243
241
2009-01-01T15:50:33Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project realizes a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' in 2002. Several improvements and changes on this implementation are done by ''Thomas Wien'' since 2007. You will find the PHP source in the section [[Download]]. Please, consider the [[Copyright]]. To get a short idea what is the benefit of neural networks have a look at page [[Neural Networks]].
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.8 by Thomas Wien''' (Development)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
== Todo ==
* Implementation of synapses (?)
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces
* Improving license agreement of source code
* Class parsing before including (compiler switches)
d1c3d6d34227c3490b8c0708a77e57c9b7960330
244
243
2009-01-01T18:12:31Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project realizes a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' in 2002. Several improvements and changes on this implementation are done by ''Thomas Wien'' since 2007. You will find the PHP source in the section [[Download]]. Please, consider the [[Copyright]]. To get a short idea what is the benefit of neural networks have a look at page [[Neural Networks]].
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.8 by Thomas Wien''' (Development)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
== Todo ==
* Implementation of synapses (?)
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces
* Improving license agreement of source code
* Class parsing before including (compiler switches)
1f54c3369c41edab2fbe92dfbb8b62169b73a211
245
244
2009-01-01T20:08:17Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; several improvements and changes have been made to this implementation by ''Thomas Wien'' since 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. For a short idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details showing output differences from their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loader class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if an input pattern already produces its desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relations at construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.8 by Thomas Wien''' (Development)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
== Todo ==
* Implementation of synapses (?)
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces
* Improving license agreement of source code
* Class parsing before including (compiler switches)
7239af6dcc79795c400f78a979ae409d87796f4d
246
245
2009-01-02T21:30:16Z
Thwien
2
/* Todo */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; several improvements and changes have been made to this implementation by ''Thomas Wien'' since 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. For a short idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details showing output differences from their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loader class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if an input pattern already produces its desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relations at construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.8 by Thomas Wien''' (Development)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
== Todo ==
* Implementation of synapses (?)
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces
* Improving license agreement of source code
* Class parsing before including (compiler switches)
* Adding error codes to exceptions
* Exception if network error does not reach minimum
df745bfd47a8d7b77e3a906ca459e4163c13d3bd
Installation
0
3
222
181
2008-12-27T13:54:53Z
Thwien
2
/* Installation */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Installation ==
* [[Download]] the source code
* Unpack the source code
>tar -xzf ann206.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source:
<source lang="php">
<?php
require_once ('ANN/ANN_Loader.php');
$ann = new ANN_Network;
?>
</source>
* Learn to use the library; have a look at the [[Examples]] chapter.
575c7c012e1e916b7b5c50c1a075401328f2cbb5
223
222
2008-12-27T13:57:30Z
Thwien
2
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Installation ==
* [[Download]] the source code
* Unpack the source code
>tar -xzf ann206.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source:
<source lang="php">
<?php
require_once ('ANN/ANN_Loader.php');
$objANN = new ANN_Network;
?>
</source>
* Learn to use the library; have a look at the [[Examples]] chapter.
== Performance issues ==
* Install Zend Optimizer
* Do not use any debugger module like xdebug
805c22fba3c9b4ef00fed30972d9ffb54cffc057
229
223
2008-12-28T18:26:25Z
Thwien
2
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or above. If you use the client-server mechanism of the network class, the '''cURL extension''' must be available on the client host that connects to an ANN server. If you use the class ''ANN_NetworkGraph'', the '''GD library with PNG support''' must be installed.
== Installation ==
* [[Download]] the source code
* Unpack the source code
>tar -xzf ann206.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source:
<source lang="php">
<?php
require_once ('ANN/ANN_Loader.php');
$objANN = new ANN_Network;
?>
</source>
* Learn to use the library; have a look at the [[Examples]] chapter.
== Performance issues ==
* Install Zend Optimizer
* Do not use any debugger module like xdebug
e90741b3413cd6639ac5648835664c898434c031
242
229
2009-01-01T15:08:43Z
Thwien
2
/* Installation */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or above. If you use the client-server mechanism of the network class, the '''cURL extension''' must be available on the client host that connects to an ANN server. If you use the class ''ANN_NetworkGraph'', the '''GD library with PNG support''' must be installed.
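These requirements can be checked at runtime before loading the library. The following sketch is an illustration using only standard PHP functions; it is not part of the ANN package.

<source lang="php">
<?php
// Check the PHP version required by ANN
if (version_compare(PHP_VERSION, '5.2.0', '<'))
  die('ANN requires PHP 5.2.x or above');

// cURL is only needed for the client-server mechanism
if (!extension_loaded('curl'))
  echo "Warning: cURL extension missing - client-server mode unavailable\n";

// GD with PNG support is only needed for ANN_NetworkGraph
if (!function_exists('imagepng'))
  echo "Warning: GD with PNG support missing - ANN_NetworkGraph unavailable\n";
?>
</source>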
== Installation ==
* [[Download]] the source code
* Unpack the source code
>tar -xzf ann207.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source:
<source lang="php">
<?php
require_once ('ANN/ANN_Loader.php');
$objANN = new ANN_Network;
?>
</source>
* Learn to use the library; have a look at the [[Examples]] chapter.
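A typical session creates a network, defines training patterns, and saves the trained network to a file. The sketch below shows the general shape only: the parameterless constructor, the chained ''input()''/''output()'' calls, and the ''setValues()'' and ''train()'' method names are assumptions for illustration, so check the phpdoc documentation and the [[Examples]] chapter for the actual API.

<source lang="php">
<?php
require_once('ANN/ANN_Loader.php');

// Create a multilayer perceptron (constructor arguments assumed)
$objNetwork = new ANN_Network;

// Define input patterns and their desired outputs (XOR as an example);
// the chained input()/output() calls are assumed for illustration
$objValues = new ANN_Values;
$objValues->input(0, 0)->output(0);
$objValues->input(0, 1)->output(1);
$objValues->input(1, 0)->output(1);
$objValues->input(1, 1)->output(0);

// Attach the values, train, and save the network (method names assumed)
$objNetwork->setValues($objValues);
$objNetwork->train();
$objNetwork->saveToFile('xor.dat');
?>
</source>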
== Performance issues ==
* Install Zend Optimizer
* Do not use any debugger module like xdebug
94920eb78fdf270d8111ab8faf96326d705cc39e
Download
0
2
225
199
2008-12-28T14:41:10Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
The current stable version of the ANN implementation for PHP 5.x is '''version 2.0.6'''. Go to the [[Installation]] section for information on how to integrate these PHP libraries into your project.
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or above. If you use the client-server mechanism of the network class, the '''cURL extension''' must be available on the client host that connects to an ANN server. If you use the class ''ANN_NetworkGraph'', the '''GD library with PNG support''' must be installed.
== Version '''2.0.6''' (2008-12-18) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details showing output differences from their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loader class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if an input pattern already produces its desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relations at construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Fixing bug: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
* [http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
'''Change-Log'''
* Initial version
8e0adce30ec6d841d06ba711dd5050a77484deb0
230
225
2008-12-28T18:31:59Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers the current stable version of the ANN implementation for PHP 5.x for download. Go to the [[Installation]] section for information about requirements and how to integrate these PHP libraries into your project.
== Version '''2.0.6''' (2008-12-18) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details showing output differences from their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loader class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if an input pattern already produces its desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relations at construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Fixing bug: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
* [http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
94f045ebd414939ff34e0da2cc691656e8a67b14
240
230
2009-01-01T15:06:56Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers the current stable version of the ANN implementation for PHP 5.x for download. Go to the [[Installation]] section for information about requirements and how to integrate these PHP libraries into your project.
== Version '''2.0.7''' (2009-01-01) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
== Version '''2.0.6''' (2008-12-18) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details showing output differences from their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loader class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if an input pattern already produces its desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relations at construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Fixing bug: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
* [http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
081050f2afdcf1ef0ef73e9d2a7e0942605fbec0
Visual network topology
0
10
227
190
2008-12-28T14:45:12Z
Thwien
2
/* PNG image of network topology */
wikitext
text/x-wiki
== PNG image of network topology ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';

try
{
  // Load a previously trained and saved network
  $objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch (Exception $e)
{
  die('Network not found');
}

// Render the network topology as a PNG image
$objNetworkImage = new ANN_NetworkGraph($objNetwork);
$objNetworkImage->saveToFile('network.png');
</source>
== Output ==
[[Image:network.png|800px|Image of network topology]]
09f7beb2a43b0a61e9aef1e19a3025b58d2fb915
Main Page
0
1
251
246
2009-10-28T07:20:44Z
Thwien
2
/* Overview */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; several improvements and changes have been made to this implementation by ''Thomas Wien'' since 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. For a short idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
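Several of the features above (momentum, dynamic learning rate, weight decay) plug into the standard backpropagation weight update. As a textbook orientation only, since the exact form used by the library may differ: with learning rate <math>\eta</math>, momentum <math>\alpha</math>, and weight decay <math>\lambda</math>, the change of a weight <math>w_{ij}</math> at training step <math>t</math> is

<math>\Delta w_{ij}(t) = -\eta \frac{\partial E}{\partial w_{ij}} + \alpha \, \Delta w_{ij}(t-1) - \lambda \, w_{ij}</math>

where <math>E</math> denotes the network error.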
== Versions and Change-Log ==
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details showing differences between outputs and their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if an input pattern already produces the desired output)
* Increasing performance on activation
* Increasing performance when testing all patterns against their desired outputs
* Increasing performance when calculating hidden deltas
* Increasing performance by defining layer relations at construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
'''Version 2.0.8 by Thomas Wien''' (Development)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
== Todo ==
* Implementation of synapses (?)
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces
* Improving license agreement of source code
* Class parsing before including (compiler switches)
* Adding error codes to exceptions
* Exception if network error does not reach minimum
256
251
2009-12-21T14:38:46Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called a ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; ''Thomas Wien'' has made several improvements and changes to this implementation since 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. For a short introduction to the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.1 by Thomas Wien''' (Development)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
'''Planned'''
* Switching to Git version control
* Moving all experimental code into a branch
* Removing all experimental code from master branch (due to performance and future development)
* String association support
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details showing differences between outputs and their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if an input pattern already produces the desired output)
* Increasing performance on activation
* Increasing performance when testing all patterns against their desired outputs
* Increasing performance when calculating hidden deltas
* Increasing performance by defining layer relations at construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* Implementation of synapses (?)
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces
* Improving license agreement of source code
* Class parsing before including (compiler switches)
* Adding error codes to exceptions
* Exception if network error does not reach minimum
257
256
2009-12-21T14:40:27Z
Thwien
2
/* Todo */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called a ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; ''Thomas Wien'' has made several improvements and changes to this implementation since 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. For a short introduction to the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.1 by Thomas Wien''' (Development)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
'''Planned'''
* Switching to Git version control
* Moving all experimental code into a branch
* Removing all experimental code from master branch (due to performance and future development)
* String association support
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details showing differences between outputs and their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if an input pattern already produces the desired output)
* Increasing performance on activation
* Increasing performance when testing all patterns against their desired outputs
* Increasing performance when calculating hidden deltas
* Increasing performance by defining layer relations at construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces (later)
* Improving license agreement of source code
* Adding error codes to exceptions
* Exception if network error does not reach minimum
259
257
2009-12-22T11:24:09Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called a ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; ''Thomas Wien'' has made several improvements and changes to this implementation since 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. For a short introduction to the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Momentum
* Dynamic learning rate
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* Weight decay
== Versions and Change-Log ==
'''Version 2.1.1 by Thomas Wien''' (Development)
* String association support
'''Version 2.1.0 by Thomas Wien''' (2009-12-22) [[Download]]
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
* Switching to Git version control
* Moving all experimental code into a branch
* Removing all experimental code from master branch (due to performance and future development)
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details showing differences between outputs and their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if an input pattern already produces the desired output)
* Increasing performance on activation
* Increasing performance when testing all patterns against their desired outputs
* Increasing performance when calculating hidden deltas
* Increasing performance by defining layer relations at construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces (later)
* Improving license agreement of source code
* Adding error codes to exceptions
* Exception if network error does not reach minimum
260
259
2009-12-22T11:24:33Z
Thwien
2
/* Features */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called a ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; ''Thomas Wien'' has made several improvements and changes to this implementation since 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. For a short introduction to the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
== Versions and Change-Log ==
'''Version 2.1.1 by Thomas Wien''' (Development)
* String association support
'''Version 2.1.0 by Thomas Wien''' (2009-12-22) [[Download]]
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
* Switching to Git version control
* Moving all experimental code into a branch
* Removing all experimental code from master branch (due to performance and future development)
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details showing differences between outputs and their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if an input pattern already produces the desired output)
* Increasing performance on activation
* Increasing performance when testing all patterns against their desired outputs
* Increasing performance when calculating hidden deltas
* Increasing performance by defining layer relations at construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces (later)
* Improving license agreement of source code
* Adding error codes to exceptions
* Exception if network error does not reach minimum
273
260
2009-12-22T18:53:22Z
Thwien
2
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called a ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; ''Thomas Wien'' has made several improvements and changes to this implementation since 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. For a short introduction to the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Donate ==
If you want to support the current and future development of this project, I would appreciate a donation of any amount via PayPal.
[http://ann.thwien.de/donate/donate.html Donate now]
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
== Versions and Change-Log ==
'''Version 2.1.1 by Thomas Wien''' (Development)
* String association support
'''Version 2.1.0 by Thomas Wien''' (2009-12-22) [[Download]]
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
* Switching to Git version control
* Moving all experimental code into a branch
* Removing all experimental code from master branch (due to performance and future development)
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details showing differences between outputs and their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if an input pattern already produces the desired output)
* Increasing performance on activation
* Increasing performance when testing all patterns against their desired outputs
* Increasing performance when calculating hidden deltas
* Increasing performance by defining layer relations at construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces (later)
* Improving license agreement of source code
* Adding error codes to exceptions
* Exception if network error does not reach minimum
275
273
2009-12-23T17:06:20Z
Thwien
2
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called a ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; ''Thomas Wien'' has made several improvements and changes to this implementation since 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. For a short introduction to the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Donate ==
If you want to support the current and future development of this project, I would appreciate a donation of any amount via PayPal.
[http://ann.thwien.de/donate/donate.html Donate now]
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
== Versions and Change-Log ==
'''Version 2.1.2 by Thomas Wien''' (Development)
* Classification support
'''Version 2.1.1 by Thomas Wien''' (2009-12-23) [[Download]]
* String association support
'''Version 2.1.0 by Thomas Wien''' (2009-12-22) [[Download]]
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
* Switching to Git version control
* Moving all experimental code into branch
* Removing all experimental code from master branch (due to performance and future development)
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details of output differences to their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduced overfitting (no training step if an input pattern already produces its desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance improvements
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* More Examples
* Performance check depending on host system
* Wiki: More details on installation and use
* PHPDoc: More detailed documentation
* Supporting PHP 5.3 namespaces (later)
* Improving license agreement of source code
* Adding error codes to exceptions
* Exception if network error does not reach minimum
f896a48794a3d6b42fc385ecd915ba917c960c08
282
275
2009-12-26T14:52:54Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; ''Thomas Wien'' has made several improvements and changes to this implementation since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short introduction to the benefits of neural networks, see the [[Neural Networks]] page.
== Donate ==
If you want to support the current and future development of this project, I would appreciate a donation of any amount via PayPal.
[http://ann.thwien.de/donate/donate.html Donate now]
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
== Versions and Change-Log ==
'''Version 2.1.3 by Thomas Wien''' (Development)
* Image matrix support
'''Version 2.1.2 by Thomas Wien''' (2009-12-26) [[Download]]
* Classification support
'''Version 2.1.1 by Thomas Wien''' (2009-12-23)
* String association support
'''Version 2.1.0 by Thomas Wien''' (2009-12-22)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
* Switching to Git version control
* Moving all experimental code into branch
* Removing all experimental code from master branch (due to performance and future development)
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details of output differences to their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduced overfitting (no training step if an input pattern already produces its desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance improvements
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* More Examples
* Performance check depending on host system
* Wiki: More details on installation and use
* PHPDoc: More detailed documentation
* Supporting PHP 5.3 namespaces (later)
* Improving license agreement of source code
* Adding error codes to exceptions
* Exception if network error does not reach minimum
d5bdd50d6ab75507aaa6bd7189ba29661f565395
283
282
2009-12-26T14:54:43Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; ''Thomas Wien'' has made several improvements and changes to this implementation since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short introduction to the benefits of neural networks, see the [[Neural Networks]] page.
== Donate ==
If you want to support the current and future development of this project, I would appreciate a donation of any amount via PayPal.
[http://ann.thwien.de/donate/donate.html Donate now]
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
== Versions and Change-Log ==
'''Version 2.1.3 by Thomas Wien''' (Development)
* Image matrix support
'''Version 2.1.2 by Thomas Wien''' (2009-12-26) [[Download]]
* Classification support
'''Version 2.1.1 by Thomas Wien''' (2009-12-23)
* String association support
'''Version 2.1.0 by Thomas Wien''' (2009-12-22)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
* Switching to Git version control
* Moving all experimental code into branch
* Removing all experimental code from master branch (due to performance and future development)
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details of output differences to their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduced overfitting (no training step if an input pattern already produces its desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance improvements
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* More Examples
* Performance check depending on host system
* Wiki: More details on installation and use
* PHPDoc: More detailed documentation
* Supporting PHP 5.3 namespaces (later)
* Improving license agreement of source code
* Adding error codes to exceptions
* Exception if network error does not reach minimum
6b32f0d2f3097301f406759d202c99be55918ff7
284
283
2009-12-26T14:55:35Z
Thwien
2
/* Features */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; ''Thomas Wien'' has made several improvements and changes to this implementation since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short introduction to the benefits of neural networks, see the [[Neural Networks]] page.
== Donate ==
If you want to support the current and future development of this project, I would appreciate a donation of any amount via PayPal.
[http://ann.thwien.de/donate/donate.html Donate now]
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* String association
* Classification
== Versions and Change-Log ==
'''Version 2.1.3 by Thomas Wien''' (Development)
* Image matrix support
'''Version 2.1.2 by Thomas Wien''' (2009-12-26) [[Download]]
* Classification support
'''Version 2.1.1 by Thomas Wien''' (2009-12-23)
* String association support
'''Version 2.1.0 by Thomas Wien''' (2009-12-22)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
* Switching to Git version control
* Moving all experimental code into branch
* Removing all experimental code from master branch (due to performance and future development)
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details of output differences to their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduced overfitting (no training step if an input pattern already produces its desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance improvements
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* More Examples
* Performance check depending on host system
* Wiki: More details on installation and use
* PHPDoc: More detailed documentation
* Supporting PHP 5.3 namespaces (later)
* Improving license agreement of source code
* Adding error codes to exceptions
* Exception if network error does not reach minimum
d1f18feafeb7d16657aa9420676a217db2788103
288
284
2009-12-27T12:22:48Z
Thwien
2
/* Features */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; ''Thomas Wien'' has made several improvements and changes to this implementation since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short introduction to the benefits of neural networks, see the [[Neural Networks]] page.
== Donate ==
If you want to support the current and future development of this project, I would appreciate a donation of any amount via PayPal.
[http://ann.thwien.de/donate/donate.html Donate now]
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* String association
* Classification
* Phar support (as of PHP 5.3.0)
== Versions and Change-Log ==
'''Version 2.1.3 by Thomas Wien''' (Development)
* Image matrix support
'''Version 2.1.2 by Thomas Wien''' (2009-12-26) [[Download]]
* Classification support
'''Version 2.1.1 by Thomas Wien''' (2009-12-23)
* String association support
'''Version 2.1.0 by Thomas Wien''' (2009-12-22)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
* Switching to Git version control
* Moving all experimental code into branch
* Removing all experimental code from master branch (due to performance and future development)
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details of output differences to their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduced overfitting (no training step if an input pattern already produces its desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance improvements
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* More Examples
* Performance check depending on host system
* Wiki: More details on installation and use
* PHPDoc: More detailed documentation
* Supporting PHP 5.3 namespaces (later)
* Improving license agreement of source code
* Adding error codes to exceptions
* Exception if network error does not reach minimum
bc14948c704f5785fbbb3a7c478379ab1b986bcd
290
288
2009-12-27T13:28:17Z
Thwien
2
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; ''Thomas Wien'' has made several improvements and changes to this implementation since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short introduction to the benefits of neural networks, see the [[Neural Networks]] page.
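Several entries in the change-log below (hyperbolic tangent transfer function, momentum, shuffling the input patterns each epoch) describe standard multilayer perceptron training with backpropagation. As an illustration only, here is a minimal sketch in Python of those techniques applied to the classic XOR problem; all names are hypothetical and this is not the library's actual ANN_Network API:

```python
import math
import random

def make_mlp_trainer(epochs=3000, lr=0.2, momentum=0.7, seed=7):
    """Tiny 2-3-1 multilayer perceptron for XOR: tanh transfer
    function, backpropagation, and a momentum term on each weight."""
    random.seed(seed)
    n_in, n_hid = 3, 3                      # 2 inputs + bias, 3 hidden units
    w_h = [[random.uniform(-0.5, 0.5) for _ in range(n_hid)] for _ in range(n_in)]
    w_o = [random.uniform(-0.5, 0.5) for _ in range(n_hid + 1)]
    dw_h = [[0.0] * n_hid for _ in range(n_in)]     # previous weight changes
    dw_o = [0.0] * (n_hid + 1)

    def forward(x):
        xi = list(x) + [1.0]                # append bias input
        h = [math.tanh(sum(xi[i] * w_h[i][j] for i in range(n_in)))
             for j in range(n_hid)]
        hi = h + [1.0]                      # hidden activations + bias
        o = math.tanh(sum(hi[i] * w_o[i] for i in range(n_hid + 1)))
        return xi, hi, o

    patterns = [([0, 0], 0.0), ([0, 1], 1.0), ([1, 0], 1.0), ([1, 1], 0.0)]
    errors = []
    for _ in range(epochs):
        random.shuffle(patterns)            # reshuffle each epoch
        total = 0.0
        for x, target in patterns:
            xi, hi, o = forward(x)
            total += (target - o) ** 2
            d_o = (target - o) * (1.0 - o * o)          # tanh'(u) = 1 - tanh(u)^2
            d_h = [(1.0 - hi[j] * hi[j]) * w_o[j] * d_o for j in range(n_hid)]
            for i in range(n_hid + 1):                  # momentum update, output layer
                dw_o[i] = lr * d_o * hi[i] + momentum * dw_o[i]
                w_o[i] += dw_o[i]
            for i in range(n_in):                       # momentum update, hidden layer
                for j in range(n_hid):
                    dw_h[i][j] = lr * d_h[j] * xi[i] + momentum * dw_h[i][j]
                    w_h[i][j] += dw_h[i][j]
        errors.append(total)
    return (lambda x: forward(x)[2]), errors
```

The momentum term adds a fraction of the previous weight change to the current one, which smooths the gradient steps; the per-epoch total squared error should fall as training proceeds.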
== Donate ==
If you want to support the current and future development of this project, I would appreciate a donation of any amount via PayPal.
[http://ann.thwien.de/donate/donate.html Donate now]
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* String association
* Classification
* Phar support (as of PHP 5.3.0)
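Two of these features can be sketched briefly: automatic output type detection distinguishes binary (0/1) desired outputs from linear (arbitrary real) ones, and the network supports both a binary threshold function and a hyperbolic tangent transfer function. A minimal Python illustration with hypothetical helper names, not the library's actual API:

```python
import math

def detect_output_type(desired_outputs, eps=1e-9):
    """Classify a training set's desired outputs as 'binary'
    (every value is 0 or 1) or 'linear' (arbitrary reals)."""
    values = [v for pattern in desired_outputs for v in pattern]
    if all(abs(v) < eps or abs(v - 1.0) < eps for v in values):
        return "binary"
    return "linear"

def threshold(u):
    """Binary threshold activation: fires 1 at or above zero."""
    return 1.0 if u >= 0.0 else 0.0

def tanh_transfer(u):
    """Hyperbolic tangent transfer function, output in (-1, 1)."""
    return math.tanh(u)
```
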
== Versions and Change-Log ==
'''Version 2.1.3 by Thomas Wien''' (Development)
* Image matrix support
'''Version 2.1.2 by Thomas Wien''' (2009-12-26) [[Download]]
* Classification support
* Phar support (as of PHP 5.3.0)
'''Version 2.1.1 by Thomas Wien''' (2009-12-23)
* String association support
'''Version 2.1.0 by Thomas Wien''' (2009-12-22)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
* Switching to Git version control
* Moving all experimental code into branch
* Removing all experimental code from master branch (due to performance and future development)
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details of output differences to their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduced overfitting (no training step if an input pattern already produces its desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Supporting PHP 5.3 namespaces (later)
* Improving license agreement of source code
* Adding error codes to exceptions
* Exception if network error does not reach minimum
4bfa5446da2d7e5294e18496175b1de51b2acd7f
294
290
2009-12-28T18:02:45Z
Thwien
2
/* Donate */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called a ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; ''Thomas Wien'' has made numerous improvements and changes to this implementation since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short idea of what neural networks are good for, have a look at the [[Neural Networks]] page.
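For reference, the output of a multilayer perceptron with one hidden layer can be written in the standard textbook form (generic symbols, not identifiers from this library):

<math>y_k = \varphi\left(\sum_j w_{jk} \, \varphi\left(\sum_i w_{ij} x_i + b_j\right) + b_k\right)</math>

where <math>\varphi</math> is the transfer function (for example the hyperbolic tangent listed in the change log), <math>w</math> are the trainable weights and <math>b</math> the bias terms.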
== Donate ==
If you want to support the current and future development of this project, I would appreciate a donation of any amount via PayPal.
[http://ann.thwien.de/donate/donate.html Donate now]
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* String association
* Classification
* Phar support (as of PHP 5.3.0)
== Versions and Change-Log ==
'''Version 2.1.4 by Thomas Wien''' (Development)
* Image matrix support
'''Version 2.1.3 by Thomas Wien''' (2010-01-06) [[Download]]
* Date input support class
'''Version 2.1.2 by Thomas Wien''' (2009-12-26)
* Classification support
* Phar support (as of PHP 5.3.0)
'''Version 2.1.1 by Thomas Wien''' (2009-12-23)
* String association support
'''Version 2.1.0 by Thomas Wien''' (2009-12-22)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
* Switching to Git version control
* Moving all experimental code into branch
* Removing all experimental code from master branch (due to performance and future development)
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details showing the differences between outputs and their desired values
* Completely rewritten variable naming standard
* New class ANN_Values for defining input and output values
* Code examples added to the phpdoc documentation
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduced overfitting (no training step if an input pattern already produces its desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relations at construction time
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* More examples
* Performance check depending on the host system
* Wiki: more details on installation and usage
* PHPDoc: more detailed documentation
* Supporting PHP 5.3 namespaces (later)
* Improving the license agreement of the source code
* Adding error codes to exceptions
* Throwing an exception if the network error does not reach its minimum
= FAQ =
== What are the dat-files? ==
A dat-file is an auto-generated file containing a serialized structure of the saved object. It is generated while training the neural network, using '''ANN_InputValue::saveToFile()''' or '''ANN_Network::saveToFile()''', and can be reloaded into the running network with '''ANN_InputValue::loadFromFile()''' or '''ANN_Network::loadFromFile()'''.
The dat-files are not included in the example code downloads because they are auto-generated.
To save the dat-files, the directory you are writing them to must be writable by the running PHP process.
For example, if your PHP script runs as an Apache module and Apache runs as user ''www-data'', you can use the following commands to set the permissions.
Change to the directory where your own ANN script is stored.
>cd <PROJECTDIR-OF-YOUR-ANN>
Create a subdirectory for all your dat-files.
>mkdir dats
Change the group owner of this subdirectory to ''www-data''.
>chgrp www-data dats
Change the UNIX permissions to the group for write access to this subdirectory.
>chmod g+w dats
For security, also remove all permissions for ''others''.
>chmod o-rwx dats
List your permissions.
>ls -la dats
drwxrwx--- 7 user www-data 4.0K 2009-10-28 08:52 dats
Use this subdirectory in your PHP scripts:
<source lang="php">
require_once 'ANN/Loader.php';

try
{
  $objNetwork = ANN_Network::loadFromFile('dats/xor.dat');
}
catch(Exception $e)
{
  print 'Creating a new one...';

  $objNetwork = new ANN_Network;

  $objValues = new ANN_Values;

  $objValues->train()
            ->input(0, 0)->output(0)
            ->input(0, 1)->output(1)
            ->input(1, 0)->output(1)
            ->input(1, 1)->output(0);

  $objValues->saveToFile('dats/values_xor.dat');

  unset($objValues);
}

try
{
  $objValues = ANN_Values::loadFromFile('dats/values_xor.dat');
}
catch(Exception $e)
{
  die('Loading of values failed');
}

$objNetwork->setValues($objValues); // to be called as of version 2.0.6

$boolTrained = $objNetwork->train();

print ($boolTrained)
  ? 'Network trained'
  : 'Network not trained completely. Please re-run the script';

$objNetwork->saveToFile('dats/xor.dat');

$objNetwork->printNetwork();
</source>
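The saveToFile() / loadFromFile() pattern above amounts to serializing an object to disk and deserializing it on the next run. A minimal sketch of the same pattern in Python, with pickle standing in for PHP's serialize(); the Network class here is a hypothetical stand-in, not ANN_Network:

```python
import os
import pickle
import tempfile

class Network:
    """Hypothetical stand-in for ANN_Network: holds trained weights."""

    def __init__(self, weights):
        self.weights = weights

    def save_to_file(self, path):
        # Serialize the whole object to disk, like ANN_Network::saveToFile().
        with open(path, "wb") as handle:
            pickle.dump(self, handle)

    @classmethod
    def load_from_file(cls, path):
        # open() raises if the file is missing, mirroring the exception
        # caught in the PHP example above.
        with open(path, "rb") as handle:
            return pickle.load(handle)

path = os.path.join(tempfile.gettempdir(), "xor_sketch.dat")
Network([0.5, -0.25]).save_to_file(path)
restored = Network.load_from_file(path)
print(restored.weights)  # [0.5, -0.25]
```

As in the PHP example, the load is wrapped so a missing file simply triggers creating and training a fresh network.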
= Selling Icecreams =
== FAQ ==
For information about dat-files, see the [[FAQ]] page.
== Training ==
<source lang="php">
require_once 'ANN/Loader.php';

try
{
  $objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
  print 'Creating a new one...';

  $objNetwork = new ANN_Network(2, 8, 1);

  $objTemperature = new ANN_InputValue(-15, 50); // Temperature in Celsius
  $objTemperature->saveToFile('input_temperature.dat');

  $objHumidity = new ANN_InputValue(0, 100); // Humidity percentage
  $objHumidity->saveToFile('input_humidity.dat');

  $objIcecream = new ANN_OutputValue(0, 300); // Quantity of sold ice-creams
  $objIcecream->saveToFile('output_quantity.dat');

  $objValues = new ANN_Values;

  $objValues->train()
            ->input($objTemperature->getInputValue(20), $objHumidity->getInputValue(10))
            ->output($objIcecream->getOutputValue(20))
            ->input($objTemperature->getInputValue(30), $objHumidity->getInputValue(40))
            ->output($objIcecream->getOutputValue(90))
            ->input($objTemperature->getInputValue(32), $objHumidity->getInputValue(30))
            ->output($objIcecream->getOutputValue(70))
            ->input($objTemperature->getInputValue(33), $objHumidity->getInputValue(20))
            ->output($objIcecream->getOutputValue(75));

  $objValues->saveToFile('values_icecreams.dat');

  unset($objValues);
  unset($objTemperature);
  unset($objHumidity);
  unset($objIcecream);
}

try
{
  $objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
  $objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
  $objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
  die('Error loading value objects');
}

try
{
  $objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
  die('Loading of values failed');
}

$objNetwork->setValues($objValues); // to be called as of version 2.0.6

$boolTrained = $objNetwork->train();

print ($boolTrained)
  ? 'Network trained'
  : 'Network not trained completely. Please re-run the script';

$objNetwork->saveToFile('icecreams.dat');

$objNetwork->printNetwork();
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
die('Network not found');
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objValues->input( // input values appended to the loaded ones
$objTemperature->getInputValue(17),
$objHumidity->getInputValue(12)
)
->input(
$objTemperature->getInputValue(31),
$objHumidity->getInputValue(42)
)
->input(
$objTemperature->getInputValue(31),
$objHumidity->getInputValue(34)
)
->input(
$objTemperature->getInputValue(34),
$objHumidity->getInputValue(21)
);
$objNetwork->setValues($objValues);
$arrOutputs = $objNetwork->getOutputs();
foreach($arrOutputs as $arrOutput)
foreach($arrOutput as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput). '<br />';
</source>
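Internally, the input and output value objects presumably map real-world ranges onto a normalized interval via min-max scaling, and getRealOutputValue() reverses that mapping. A minimal sketch of this idea, using hypothetical helper functions rather than the library's actual API:

```php
<?php
// Hypothetical sketch (not the library's actual API): min-max scaling
// as presumably used by ANN_InputValue / ANN_OutputValue to map a
// real-world range such as 0..300 ice-creams onto the interval 0..1.
function normalizeValue($floatValue, $floatMin, $floatMax)
{
  return ($floatValue - $floatMin) / ($floatMax - $floatMin);
}

// The reverse mapping, as getRealOutputValue() would apply it to a
// raw network output.
function denormalizeValue($floatOutput, $floatMin, $floatMax)
{
  return $floatOutput * ($floatMax - $floatMin) + $floatMin;
}

print normalizeValue(150, 0, 300) . "\n";   // 0.5
print denormalizeValue(0.5, 0, 300) . "\n"; // 150
```

Such scaling is needed because the network's transfer functions produce outputs in a bounded range, so real quantities have to be mapped into and back out of that range.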
dc61061a3c45674fd3b3f917f3ee11193b6665cf
Installation
0
3
254
242
2009-10-28T07:56:45Z
Thwien
2
/* Installation */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or later. If the client-server mechanism of the network class is used, the '''curl extension''' must be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' must be installed.
== Installation ==
* [[Download]] the source code
* Unpack the source code
>tar -xzf ann207.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source:
<source lang="php">
<?php
require_once ('ANN/ANN_Loader.php');
$objANN = new ANN_Network;
?>
</source>
* Learn to use the library. Have a look at the [[Examples]] chapter.
* For further information, e.g. about dat-files, have a look at the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer
* Do not use any debugger module like xdebug
32554fed057c0d8b4b8e16ff09ed5354505f5c98
262
254
2009-12-22T11:26:42Z
Thwien
2
/* Installation */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or later. If the client-server mechanism of the network class is used, the '''curl extension''' must be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' must be installed.
== Installation ==
* [[Download]] the source code
* Unpack the source code
>tar -xzf ann210.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source:
<source lang="php">
<?php
require_once ('ANN/Loader.php');
$objANN = new ANN_Network;
?>
</source>
* Learn to use the library. Have a look at the [[Examples]] chapter.
* For further information, e.g. about dat-files, have a look at the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer
* Do not use any debugger module like xdebug
914e015b0bbabf2d261619018f89e1f79208d05f
263
262
2009-12-22T11:27:20Z
Thwien
2
/* Performance issues */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or later. If the client-server mechanism of the network class is used, the '''curl extension''' must be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' must be installed.
== Installation ==
* [[Download]] the source code
* Unpack the source code
>tar -xzf ann210.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source:
<source lang="php">
<?php
require_once ('ANN/Loader.php');
$objANN = new ANN_Network;
?>
</source>
* Learn to use the library. Have a look at the [[Examples]] chapter.
* For further information, e.g. about dat-files, have a look at the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer
* Do not use any debugger module like xdebug
* Do not use any profiling tool
9a3b62dbfaacd00a862c5b1b8dc8cfab76acbbbd
287
263
2009-12-27T12:22:00Z
Thwien
2
/* Installation */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or later. If the client-server mechanism of the network class is used, the '''curl extension''' must be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' must be installed.
== Installation ==
* [[Download]] the source code
* Unpack the source code
>tar -xzf ann212.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source:
<source lang="php">
<?php
require_once 'ANN/Loader.php';
$objNetwork = new ANN_Network;
</source>
'''or''' as a phar library (supported as of PHP 5.3.0)
<source lang="php">
<?php
require_once 'phar://ann212.phar.gz';
$objNetwork = new ANN_Network;
</source>
* Learn to use the library. Have a look at the [[Examples]] chapter.
* For further information, e.g. about dat-files, have a look at the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer
* Do not use any debugger module like xdebug
* Do not use any profiling tool
9a22bfbe82e84c8f024f6a78a3b0452e5e35b694
289
287
2009-12-27T12:56:47Z
Thwien
2
/* Installation */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or later. If the client-server mechanism of the network class is used, the '''curl extension''' should be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' should be installed.
== Installation ==
* [[Download]] the source code
* Check the integrity of the downloaded archive
>md5sum ann212*
* Unpack the source code
>tar -xzf ann212.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source:
<source lang="php">
<?php
require_once 'ANN/Loader.php';
$objNetwork = new ANN_Network;
</source>
'''or''' as a phar library (supported as of PHP 5.3.0)
<source lang="php">
<?php
require_once 'phar://ann212.phar.gz';
$objNetwork = new ANN_Network;
</source>
* Learn to use the library. Have a look at the [[Examples]] chapter.
* For further information, e.g. about dat-files, have a look at the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer
* Do not use any debugger module like xdebug
* Do not use any profiling tool
40f69baf37ef35fcf4bd2bfc812a879a2543f82d
292
289
2009-12-27T13:42:22Z
Thwien
2
/* Requirements */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or later. If the client-server mechanism of the network class is used, the '''curl extension''' should be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' should be installed.
== Installation ==
* [[Download]] the source code
* Check the integrity of the downloaded archive
>md5sum ann212*
* Unpack the source code
>tar -xzf ann212.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source:
<source lang="php">
<?php
require_once 'ANN/Loader.php';
$objNetwork = new ANN_Network;
</source>
'''or''' as a phar library (supported as of PHP 5.3.0)
<source lang="php">
<?php
require_once 'phar://ann212.phar.gz';
$objNetwork = new ANN_Network;
</source>
* Learn to use the library. Have a look at the [[Examples]] chapter.
* For further information, e.g. about dat-files, have a look at the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer
* Do not use any debugger module like xdebug
* Do not use any profiling tool
f6577224ead8d6ed3256f57292c1294a6ecba3c4
298
292
2010-01-06T18:23:47Z
Thwien
2
/* Installation */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or later. If the client-server mechanism of the network class is used, the '''curl extension''' should be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' should be installed.
== Installation ==
* [[Download]] the source code
* Check the integrity of the downloaded archive
>md5sum ann213*
* Unpack the source code
>tar -xzf ann213.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source:
<source lang="php">
<?php
require_once 'ANN/Loader.php';
$objNetwork = new ANN_Network;
</source>
'''or''' as a phar library (supported as of PHP 5.3.0)
<source lang="php">
<?php
require_once 'phar://ann213.phar.gz';
$objNetwork = new ANN_Network;
</source>
* Learn to use the library. Have a look at the [[Examples]] chapter.
* For further information, e.g. about dat-files, have a look at the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer
* Do not use any debugger module like xdebug
* Do not use any profiling tool
f54762940b5ed3d0556104aa5ca796ff95cb869e
Logical XOR function
0
8
255
249
2009-10-28T07:57:01Z
Thwien
2
wikitext
text/x-wiki
== FAQ ==
For information about dat-files, have a look at the [[FAQ]] page.
== Training ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network;
$objValues = new ANN_Values;
$objValues->train()
->input(0,0)->output(0)
->input(0,1)->output(1)
->input(1,0)->output(1)
->input(1,1)->output(0);
$objValues->saveToFile('values_xor.dat');
unset($objValues);
}
try
{
$objValues = ANN_Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$boolTrained = $objNetwork->train();
print ($boolTrained)
? 'Network trained'
: 'Network not trained completely. Please re-run the script';
$objNetwork->saveToFile('xor.dat');
$objNetwork->printNetwork();
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/ANN_Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
die('Network not found');
}
try
{
$objValues = ANN_Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objValues->input(0, 1) // input values appended to the loaded ones
->input(1, 1)
->input(1, 0)
->input(0, 0)
->input(0, 1)
->input(1, 1);
$objNetwork->setValues($objValues);
print_r($objNetwork->getOutputs());
</source>
aef5dd26f7a2826fcefa1025b0df2d6b902da511
264
255
2009-12-22T11:27:58Z
Thwien
2
/* Training */
wikitext
text/x-wiki
== FAQ ==
For information about dat-files, have a look at the [[FAQ]] page.
== Training ==
<source lang="php">
require_once 'ANN/Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network;
$objValues = new ANN_Values;
$objValues->train()
->input(0,0)->output(0)
->input(0,1)->output(1)
->input(1,0)->output(1)
->input(1,1)->output(0);
$objValues->saveToFile('values_xor.dat');
unset($objValues);
}
try
{
$objValues = ANN_Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$boolTrained = $objNetwork->train();
print ($boolTrained)
? 'Network trained'
: 'Network not trained completely. Please re-run the script';
$objNetwork->saveToFile('xor.dat');
$objNetwork->printNetwork();
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
die('Network not found');
}
try
{
$objValues = ANN_Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objValues->input(0, 1) // input values appended to the loaded ones
->input(1, 1)
->input(1, 0)
->input(0, 0)
->input(0, 1)
->input(1, 1);
$objNetwork->setValues($objValues);
print_r($objNetwork->getOutputs());
</source>
17e90828ee627c9ba1d3e14b36c8bfcde5c5caa7
265
264
2009-12-22T11:28:11Z
Thwien
2
/* Using trained network */
wikitext
text/x-wiki
== FAQ ==
For information about dat-files, have a look at the [[FAQ]] page.
== Training ==
<source lang="php">
require_once 'ANN/Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network;
$objValues = new ANN_Values;
$objValues->train()
->input(0,0)->output(0)
->input(0,1)->output(1)
->input(1,0)->output(1)
->input(1,1)->output(0);
$objValues->saveToFile('values_xor.dat');
unset($objValues);
}
try
{
$objValues = ANN_Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$boolTrained = $objNetwork->train();
print ($boolTrained)
? 'Network trained'
: 'Network not trained completely. Please re-run the script';
$objNetwork->saveToFile('xor.dat');
$objNetwork->printNetwork();
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
die('Network not found');
}
try
{
$objValues = ANN_Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objValues->input(0, 1) // input values appended to the loaded ones
->input(1, 1)
->input(1, 0)
->input(0, 0)
->input(0, 1)
->input(1, 1);
$objNetwork->setValues($objValues);
print_r($objNetwork->getOutputs());
</source>
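For a binary problem like XOR, the values returned by getOutputs() are floats close to 0 or 1 rather than exact binary values. A simple way to read them as booleans is to threshold at 0.5. This is a post-processing sketch, not part of the library, and the sample outputs are made up for illustration:

```php
<?php
// Hypothetical post-processing sketch: $arrOutputs stands in for the
// return value of $objNetwork->getOutputs() after training on XOR.
$arrOutputs = array(array(0.93), array(0.07), array(0.88));

foreach($arrOutputs as $arrOutput)
  foreach($arrOutput as $floatOutput)
  {
    // Round each raw output to the nearest binary value.
    $intBinary = ($floatOutput >= 0.5) ? 1 : 0;
    print $intBinary . '<br />';
  }
```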
401da0665c3b2bdf2be317cee122af379b86c8b6
Download
0
2
258
240
2009-12-22T11:22:01Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers the current stable version of the ANN implementation for PHP 5.x for download. See the [[Installation]] section for information about requirements and how to integrate these PHP libraries into your project.
== Version '''2.1.0''' (2009-12-22) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [http://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate set to 0.7
* Code changes based on profiling
* Changed formatting of printed network details
* Removed trailing PHP end tag
* Code standard improvements
* Remove momentum, precision and unused math methods
* Remove unused methods
* Remove error weight derivative
* Removing weight decay
* Removing dynamic learning rate
* Removing algorithm switches and experimental algorithms; only the standard backpropagation algorithm is used
* Renaming class file names
* Use of learning rate and delta
* Using learning rate
* Rounding of network error value
== Version '''2.0.7''' (2009-01-01) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() returns a float, not an array
== Version '''2.0.6''' (2008-12-18) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details of output differences from their desired values
* Completely rewritten variable naming standard
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduced overfitting (no training if an input pattern already produces the desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes into several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
* [http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
9a17dcdf9dd3c9c0c6a40757ffb51d887d74fd51
274
258
2009-12-23T17:05:00Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers the current stable version of the ANN implementation for PHP 5.x for download. See the [[Installation]] section for information about requirements and how to integrate these PHP libraries into your project.
== Version '''2.1.1''' (2009-12-23) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [http://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of string association with ANN_StringValue
== Version '''2.1.0''' (2009-12-22) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [http://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate set to 0.7
* Code changes based on profiling
* Changed formatting of printed network details
* Removed trailing PHP end tag
* Code standard improvements
* Remove momentum, precision and unused math methods
* Remove unused methods
* Remove error weight derivative
* Removing weight decay
* Removing dynamic learning rate
* Removing algorithm switches and experimental algorithms; only the standard backpropagation algorithm is used
* Renaming class file names
* Use of learning rate and delta
* Using learning rate
* Rounding of network error value
== Version '''2.0.7''' (2009-01-01) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() returns a float, not an array
== Version '''2.0.6''' (2008-12-18) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details of output differences from their desired values
* Completely rewritten variable naming standard
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduced overfitting (no training if an input pattern already produces the desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes into several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
* [http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
d9ff7fea5af49338a45ea104b3afccf179a435f8
278
274
2009-12-23T17:17:10Z
Thwien
2
/* Version 2.1.1 (2009-12-23) stable */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers the current stable version of the ANN implementation for PHP 5.x for download. See the [[Installation]] section for information about requirements and how to integrate these PHP libraries into your project.
== Version '''2.1.1''' (2009-12-23) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [http://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of string association with ANN_StringValue (Example see: [[Detection of language]])
== Version '''2.1.0''' (2009-12-22) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [http://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate to 0.7
* Code changes referring profiling
* Change printing network details formatting
* Remove trailing php end tag
* Code-Standard
* Remove momentum, precision and unused math methods
* Remove unused methods
* Remove error weight derivative
* Removing weight decay
* Removing dynamic learning rate
* Removing algorithm switches and experimental algorithms. Just standard back propagation algorithm used
* Renaming class file names
* Learning rate and delta using
* Using learning rate
* Rounding of network error value
== Version '''2.0.7''' (2009-01-01) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
== Version '''2.0.6''' (2008-12-18) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparision in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Tangens hyperbolicus transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
* [http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
a20787634fd5f3a3cf7573505d16de38725941d2
281
278
2009-12-26T14:50:36Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers the current stable version of the ANN implementation for PHP 5.x for download. See the [[Installation]] section for information on the requirements and on how to integrate these PHP libraries into your project.
== Version '''2.1.3''' (2010-01-06) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann213.zip Download - ann213.zip] (34 KB)
* [http://ann.thwien.de/downloads/ann213.tar.gz Download - ann213.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann213.phar.gz Download (Phar) - ann213.phar.gz] (20 KB)
'''MD5 fingerprints'''
* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip
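The fingerprints above can be used to check a download for corruption before unpacking. A minimal shell sketch (the helper name <code>verify_md5</code> is ours, not part of the project; it assumes GNU coreutils' <code>md5sum</code>):

```shell
# Sketch: verify a downloaded release archive against its listed MD5 fingerprint.
# Usage: verify_md5 <file> <expected-md5>
verify_md5() {
  actual=$(md5sum "$1" | awk '{print $1}')   # first field is the hex digest
  if [ "$actual" = "$2" ]; then
    echo "$1: OK"
  else
    echo "$1: MISMATCH"
  fi
}
# Example with the tar.gz fingerprint listed above:
# verify_md5 ann213.tar.gz 0a375525863eb3f9663b64655bb7b637
```

An intact archive should report <code>OK</code>; anything else means the download should be repeated.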
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann213.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of a support class for date inputs
== Version '''2.1.2''' (2009-12-26) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann212.zip Download - ann212.zip] (31 KB)
* [http://ann.thwien.de/downloads/ann212.tar.gz Download - ann212.tar.gz] (16 KB)
* [http://ann.thwien.de/downloads/ann212.phar.gz Download (Phar) - ann212.phar.gz] (18 KB)
'''MD5 fingerprints'''
* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann212_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction to classification with ANN_Classification (Example see: [[Detection of language with classification]])
* Phar support (as of PHP 5.3.0)
== Version '''2.1.1''' (2009-12-23) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [http://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction to string association with ANN_StringValue (Example see: [[Detection of language]])
== Version '''2.1.0''' (2009-12-22) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [http://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate set to 0.7
* Code changes based on profiling
* Changed formatting of printed network details
* Removed trailing PHP end tag
* Coding standard improvements
* Removed momentum, precision and unused math methods
* Removed unused methods
* Removed error weight derivative
* Removed weight decay
* Removed dynamic learning rate
* Removed algorithm switches and experimental algorithms; only the standard backpropagation algorithm is used
* Renamed class files
* Revised use of learning rate and delta values
* Rounding of network error value
== Version '''2.0.7''' (2009-01-01) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
== Version '''2.0.6''' (2008-12-18) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details showing differences between outputs and their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
* [http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
= Multilayer perceptron =
== General ==
A multilayer perceptron is a feedforward artificial neural network: the signal flows from the input layer through the hidden layers to the output layer. During training, the error correction of the neural weights is performed in the opposite direction, using the backpropagation algorithm.
== Activation ==
At first a cumulative input is calculated by the following equation:
:<math>s = \sum^{n}_{k=1} i_{k} \cdot w_{k}</math>
Considering the ''BIAS'' value, the equation becomes:
:<math>s = \left(\sum^{n}_{k=1} i_{k} \cdot w_{k}\right) + BIAS \cdot w_{0}</math>
:<math>BIAS = 1</math>
where <math>w_{0}</math> is the weight of the ''BIAS'' input.
=== Sigmoid activation function ===
:<math>o = \rm{sig}(s) = \frac{1}{1 + \rm e^{-s}}</math>
=== Hyperbolic tangent activation function ===
:<math>o = \tanh(s)</math>
using an output range between -1 and 1, or
:<math>o = \frac{\tanh(s) + 1}{2}</math>
using an output range between 0 and 1.
:<math>s</math> cumulative input
:<math>w</math> weight of input
:<math>i</math> value of input
:<math>n</math> number of inputs
:<math>k</math> input index
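The activation step above can be sketched in PHP as follows. This is a minimal illustration only; the function names are not part of the ANN library's API:

```php
<?php
// Cumulative input s = (sum over k of i_k * w_k) + BIAS * w_0, with BIAS = 1.
// Illustrative helper, not the library's actual code.
function cumulativeInput(array $arrInputs, array $arrWeights, $floatBiasWeight)
{
  $floatSum = 0.0;

  foreach($arrInputs as $intKey => $floatInput)
    $floatSum += $floatInput * $arrWeights[$intKey];

  return $floatSum + 1 * $floatBiasWeight; // BIAS = 1
}

// Sigmoid activation function: o = 1 / (1 + e^-s)
function sigmoidActivation($floatS)
{
  return 1 / (1 + exp(-$floatS));
}

// Hyperbolic tangent activation mapped to the output range [0, 1]
function tanhActivation($floatS)
{
  return (tanh($floatS) + 1) / 2;
}
```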
== Error of neural network ==
A neural network initialized with random weights does of course not produce the expected output, so training is necessary. In supervised training, known inputs and their corresponding output values are presented to the network, which makes it possible to compare the actual output with the desired output. The error is defined by the following equation:
:<math>E={1\over2} \sum^{n}_{i=1} (t_{i}-o_{i})^{2}</math>
:<math>E</math> network error
:<math>n</math> count of input patterns
:<math>t_{i}</math> desired output
:<math>o_{i}</math> calculated output
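In PHP the network error defined above can be computed like this (an illustrative sketch, not the library's implementation):

```php
<?php
// Network error E = 1/2 * sum over i of (t_i - o_i)^2
function networkError(array $arrDesiredOutputs, array $arrRealOutputs)
{
  $floatError = 0.0;

  foreach($arrDesiredOutputs as $intKey => $floatDesired)
    $floatError += pow($floatDesired - $arrRealOutputs[$intKey], 2);

  return $floatError / 2;
}
```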
== Backpropagation ==
The learning algorithm of a single-layer perceptron is simple compared to that of a multilayer perceptron, because only the output layer is directly connected to the output; the hidden layers are not. Calculating the right weights for the hidden layers is therefore mathematically difficult. The delta value for changing the weights of a hidden neuron is given by the following equation:
:<math>\Delta w_{ij}= -\alpha \cdot {\partial E \over \partial w_{ij}} = \alpha \cdot \delta_{j} \cdot x_{i}</math>
:<math>E</math> network error
:<math>\Delta w_{ij}</math> delta value <math>w_{ij}</math> of neuron connection <math>i</math> to <math>j</math>
:<math>\alpha</math> learning rate
:<math>\delta_{j}</math> the error of neuron <math>j</math>
:<math>x_{i}</math> input of neuron <math>i</math>
:<math>t_{j}</math> desired output of output neuron <math>j</math>
:<math>o_{j}</math> real output of output neuron <math>j</math>.
== Programming solution of backpropagation ==
This PHP implementation of the multilayer perceptron uses the following algorithm for weight changes in the hidden layers and the output layer.
=== Weight change of output layer ===
:<math>\Delta w_{k} = o_{k} \cdot (a_{k} - o_{k}) \cdot (1 - o_{k})</math>
:<math>w_{mk} = w_{mk} + \alpha \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>k</math> neuron k
:<math>o</math> output
:<math>i</math> input
:<math>a</math> desired output
:<math>w</math> weight
:<math>m</math> weight m
=== Weight change of hidden layers ===
:<math>s_{k} = \beta \cdot \sum^{n}_{l=1} w_{kl} \cdot \Delta w_{l}</math>
:<math>\Delta w_{k} = o_{k} \cdot (1 - o_{k}) \cdot s_{k}</math>
:<math>w_{mk} = w_{mk} + \alpha \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>\alpha</math> learning rate
:<math>\beta</math> momentum
:<math>k</math> neuron k
:<math>l</math> neuron l
:<math>w</math> weight
:<math>m</math> weight m
:<math>i</math> input
:<math>o</math> output
:<math>n</math> count of neurons
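The weight-change equations for the output layer and the hidden layers can be sketched in PHP as follows (the function names are illustrative, not the library's classes):

```php
<?php
// Output layer: delta_k = o_k * (a_k - o_k) * (1 - o_k)
function outputDelta($floatOutput, $floatDesiredOutput)
{
  return $floatOutput * ($floatDesiredOutput - $floatOutput) * (1 - $floatOutput);
}

// Hidden layer: s_k = beta * sum over l of w_kl * delta_l,
// then delta_k = o_k * (1 - o_k) * s_k
function hiddenDelta($floatOutput, array $arrWeightsToNextLayer, array $arrNextLayerDeltas, $floatMomentum)
{
  $floatSum = 0.0;

  foreach($arrWeightsToNextLayer as $intL => $floatWeight)
    $floatSum += $floatWeight * $arrNextLayerDeltas[$intL] * $floatMomentum;

  return $floatOutput * (1 - $floatOutput) * $floatSum;
}

// Common update for both layers: w_mk = w_mk + alpha * i_m * delta_k
function adjustWeight($floatWeight, $floatLearningRate, $floatInput, $floatDelta)
{
  return $floatWeight + $floatLearningRate * $floatInput * $floatDelta;
}
```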
=== Momentum ===
To avoid oscillating weight changes, the momentum factor <math>\beta</math> is introduced; it damps the weight changes so that consecutive changes do not simply repeat or reverse each other.
=== Overfitting ===
To avoid overfitting, this PHP implementation finishes the training procedure as soon as the actual output value lies within a tolerance of 1 per cent of the desired output value.
=== Choosing learning rate and momentum ===
Properly choosing the learning rate (<math>\alpha</math>) and the momentum (<math>\beta</math>) is a matter of experience. Both values lie in the range between 0 and 1. This PHP implementation uses a default value of 0.5 for <math>\alpha</math> and 0.95 for <math>\beta</math>. Neither <math>\alpha</math> nor <math>\beta</math> may be zero; otherwise no weight change would happen and the network would never reach an errorless state. Both factors can be changed at runtime.
=== Dynamic learning rate ===
To make the network converge faster towards its lowest error, a dynamic learning rate can be used.
:<math>w_{mk} = w_{mk} + \alpha \cdot \gamma \cdot i_{m} \cdot \Delta w_{k}</math>
:<math>\alpha \cdot \gamma = [0.5 .. 0.9]</math>
:<math>\alpha</math> learning rate
:<math>\beta</math> momentum
:<math>\gamma</math> dynamic learning rate factor
:<math>k</math> neuron k
:<math>w</math> weight
:<math>m</math> weight m
:<math>i</math> input
=== Weight decay ===
Weights normally tend to grow to large values, although this is not necessary; large weights can make network convergence take too long. The weight decay algorithm tries to avoid large weights.
The weight change algorithm without weight decay is the following:
:<math>\Delta w_{i}(t) = \alpha \cdot \frac{\part E(t)}{\part w_{i}(t)}</math>
By subtracting a term proportional to the previous weight, the weight change is reduced:
:<math>\Delta w_{i}(t) = \alpha \cdot \frac{\part E(t)}{\part w_{i}(t)} - \lambda \cdot w_{i}(t-1)</math>
:<math>\lambda = [0.03 .. 0.05]</math>
:<math>w</math> weight
:<math>i</math> neuron
:<math>E</math> error function
:<math>t</math> time (training step)
:<math>\alpha</math> learning rate
:<math>\lambda</math> weight decay factor
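A single weight-decay step can be sketched in PHP like this (illustrative only; the default <math>\lambda</math> is assumed from the range given above):

```php
<?php
// Weight change with weight decay:
// dw_i(t) = (gradient-based change) - lambda * w_i(t-1)
function weightChangeWithDecay($floatGradientChange, $floatPreviousWeight, $floatLambda = 0.04)
{
  // lambda typically lies in [0.03 .. 0.05]
  return $floatGradientChange - $floatLambda * $floatPreviousWeight;
}
```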
=== Quick propagation algorithm ===
The Quickprop algorithm calculates the weight change by locally approximating the error with a quadratic function <math>f(x) = x^2</math>. Two error slopes at two different weight values define a secant; fitting this secant to a quadratic function makes it possible to calculate the minimum, where <math>f'(x) = 0</math>. The x-coordinate of the minimum becomes the new weight value.
:<math>S(t) = \frac{\part E}{\part w_{i}(t)}</math>
:<math>\Delta w_{i}(t) = \alpha \cdot \frac{\part E}{\part w_{i}(t)}</math> (normal backpropagation)
:<math>\frac{\Delta w_{i}(t)}{\alpha} = \frac{\part E}{\part w_{i}(t)}</math>
:<math>S(t) = \frac{\part E}{\part w_{i}(t)} = \frac{\Delta w_{i}(t)}{\alpha}</math>
:<math>\Delta w_{i}(t) = \frac{S(t)}{S(t-1) - S(t)} \cdot \Delta w_{i}(t-1)</math> (quick propagation)
:<math>w</math> weight
:<math>i</math> neuron
:<math>E</math> error function
:<math>t</math> time (training step)
:<math>\alpha</math> learning rate
To avoid excessively large changes, the maximum weight change is limited by the following equation:
:<math>\Delta w_{i}(t) \leq \mu \cdot \Delta w_{i}(t-1)</math>
:<math>\mu = [1.75 .. 2.25]</math>
:<math>w</math> weight
:<math>i</math> neuron
:<math>t</math> time (training step)
:<math>\mu</math> maximal weight change factor
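One Quickprop step by the equations above might be sketched in PHP as follows (illustrative only; it assumes <math>S(t-1) \neq S(t)</math> and uses a default of 2.0 for <math>\mu</math> from the stated range):

```php
<?php
// Quickprop weight change: dw(t) = S(t) / (S(t-1) - S(t)) * dw(t-1),
// limited to |dw(t)| <= mu * |dw(t-1)|. Assumes S(t-1) != S(t).
function quickpropChange($floatSlope, $floatPrevSlope, $floatPrevChange, $floatMu = 2.0)
{
  $floatChange = $floatSlope / ($floatPrevSlope - $floatSlope) * $floatPrevChange;

  // Cap the weight change to mu times the previous change
  $floatLimit = $floatMu * abs($floatPrevChange);

  return max(-$floatLimit, min($floatLimit, $floatChange));
}
```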
=== RProp (Resilient Propagation) ===
The RProp algorithm considers only the direction (sign) of the gradient, not its magnitude.
<math>\Delta w_{ij}(t) = \begin{cases}
-\Delta p_{ij}, & \text{if } \frac{\part E}{\part w_{ij}} > 0 \\
+\Delta p_{ij}, & \text{if } \frac{\part E}{\part w_{ij}} < 0 \\
0, & \text{if } \frac{\part E}{\part w_{ij}} = 0
\end{cases}</math>
<math>\Delta p_{ij}(t) = \begin{cases}
\alpha^+ \cdot \Delta p_{ij}(t-1), & \text{if } \frac{\part E}{\part w_{ij}}(t-1) \cdot \frac{\part E}{\part w_{ij}}(t) > 0 \\
\alpha^- \cdot \Delta p_{ij}(t-1), & \text{if } \frac{\part E}{\part w_{ij}}(t-1) \cdot \frac{\part E}{\part w_{ij}}(t) < 0 \\
\Delta p_{ij}(t-1), & \text{if } \frac{\part E}{\part w_{ij}}(t-1) \cdot \frac{\part E}{\part w_{ij}}(t) = 0
\end{cases}</math>
:<math>\alpha</math> learning rate
:<math>w</math> weight
:<math>p</math> weight change
:<math>\alpha^+ = 1.2</math>
:<math>\alpha^- = 0.5</math>
:<math>\Delta p(0) = 0.5</math>
:<math>\Delta p(t)_{max} = 50</math>
:<math>\Delta p(t)_{min} = 0</math>
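The sign-based RProp update can be sketched in PHP like this (an illustration of the case distinctions above, not the library's code):

```php
<?php
// Step-size update: grow by alpha+ = 1.2 on an unchanged gradient sign,
// shrink by alpha- = 0.5 on a sign change, keep it otherwise.
function rpropStepSize($floatGradient, $floatPrevGradient, $floatPrevStep)
{
  $floatSignProduct = $floatGradient * $floatPrevGradient;

  if($floatSignProduct > 0)
    return min(1.2 * $floatPrevStep, 50); // upper limit 50

  if($floatSignProduct < 0)
    return max(0.5 * $floatPrevStep, 0);  // lower limit 0

  return $floatPrevStep;
}

// The weight change uses only the direction of the gradient
function rpropWeightChange($floatGradient, $floatStep)
{
  if($floatGradient > 0)
    return -$floatStep;

  if($floatGradient < 0)
    return $floatStep;

  return 0;
}
```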
=== RProp+ ===
If the sign of the gradient changes, the RProp+ algorithm subtracts the previous weight change from the last weight change.
<math>\Delta w_{ij}(t) = \begin{cases}
\alpha^+ \cdot \Delta w_{ij}(t-1), & \text{if } \frac{\part E}{\part w_{ij}}(t-1) \cdot \frac{\part E}{\part w_{ij}}(t) > 0 \\
\Delta w_{ij}(t-1) - \Delta w_{ij}(t-2), & \text{if } \frac{\part E}{\part w_{ij}}(t-1) \cdot \frac{\part E}{\part w_{ij}}(t) < 0 \\
\Delta w_{ij}(t-1), & \text{if } \frac{\part E}{\part w_{ij}}(t-1) \cdot \frac{\part E}{\part w_{ij}}(t) = 0
\end{cases}</math>
=== iRProp+ ===
iRProp+ is an improved RProp+ algorithm with one small change: before the previous weight change is subtracted from the last one, the network error is calculated and compared. If the network error has increased, i.e. <math>E(t) > E(t-1)</math>, the RProp+ procedure is carried out. Otherwise nothing is changed, because a decreasing error indicates that the weight change is driving the network towards convergence.
<math>\Delta w_{ij}(t) = \begin{cases}
\alpha^+ \cdot \Delta w_{ij}(t-1), & \text{if } \frac{\part E}{\part w_{ij}}(t-1) \cdot \frac{\part E}{\part w_{ij}}(t) > 0 \\
\Delta w_{ij}(t-1) - \Delta w_{ij}(t-2), & \text{if } \frac{\part E}{\part w_{ij}}(t-1) \cdot \frac{\part E}{\part w_{ij}}(t) < 0 \text{ and if } E(t) > E(t-1) \\
\Delta w_{ij}(t-1), & \text{if } \frac{\part E}{\part w_{ij}}(t-1) \cdot \frac{\part E}{\part w_{ij}}(t) = 0
\end{cases}</math>
== Binary and linear input ==
With binary input, the input value is simply 0 for ''false'' and 1 for ''true''.
:<math>0 : False</math>
:<math>1 : True</math>
When using linear input values, normalization is needed:
:<math>i = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>i</math> input value for neural network
:<math>f</math> real world value
This PHP implementation supports input normalization.
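The input normalization above, sketched in PHP (the function name is illustrative; the library performs this internally):

```php
<?php
// i = (f - f_min) / (f_max - f_min): maps a real-world value into [0, 1].
// Assumes f_max > f_min.
function normalizeInput($floatValue, $floatMin, $floatMax)
{
  return ($floatValue - $floatMin) / ($floatMax - $floatMin);
}
```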
== Binary and linear output ==
Interpreting output values only makes sense for the output layer, and the interpretation depends on the use of the neural network. If the network is used for classification, binary output is used; binary output has two states, true or false. Since the network always produces linear output values, these values have to be converted to binary values:
:<math>o < 0.5 : False</math>
:<math>o \geq 0.5 : True</math>
:<math>o</math> output value
When using linear output, the output values have to be converted back to the real-world range the network is trained for:
:<math>f = o \cdot (f_{max} - f_{min}) + f_{min}</math>
:<math>f</math> real world value
:<math>o</math> real output value of neural network
While training the network, the same normalization equation used for input values is applied to the desired output values.
:<math>o = \frac{f - f_{min}}{f_{max} - f_{min}}</math>
:<math>o</math> desired output value for neural network
:<math>f</math> real world value
This PHP implementation supports output normalization.
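Both output interpretations can be sketched in PHP as follows (illustrative function names, not the library's API):

```php
<?php
// Binary interpretation: false below 0.5, true at 0.5 and above
function binaryOutput($floatOutput)
{
  return $floatOutput >= 0.5;
}

// Linear interpretation: f = o * (f_max - f_min) + f_min
function denormalizeOutput($floatOutput, $floatMin, $floatMax)
{
  return $floatOutput * ($floatMax - $floatMin) + $floatMin;
}
```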
= Visual network topology =
== PNG image of network topology ==
<source lang="php">
require_once 'ANN/Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
die('Network not found');
}
$objNetworkImage = new ANN_NetworkGraph($objNetwork);
$objNetworkImage->saveToFile('network.png');
</source>
== Output ==
[[Image:network.png|800px|Image of network topology]]
= Logging network weights =
== Logging network weights while training ==
<source lang="php">
require_once 'ANN/Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network;
$objValues = new ANN_Values;
$objValues->train()
->input(0,0)->output(0)
->input(0,1)->output(1)
->input(1,0)->output(1)
->input(1,1)->output(0);
$objValues->saveToFile('values_xor.dat');
unset($objValues);
}
try
{
$objValues = ANN_Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$objNetwork->logWeightsToFile('network.csv'); // Start logging
$objNetwork->train();
$objNetwork->saveToFile('xor.dat');
</source>
= Client-Server model =
== Server implementation ==
<source lang="php">
require_once 'ANN/Loader.php';
class ANN_MyServer extends ANN_Server
{
// ****************************************************************************
/**
* @param string $strUsername
* @param string $strPassword
* @return boolean
*/
protected function checkLogin($strUsername, $strPassword)
{
// User-defined authentication by database for example
return ($strUsername == 'username' && $strPassword == 'password');
}
// ****************************************************************************
}
$objServer = new ANN_MyServer;
</source>
== Client implementation ==
<source lang="php">
require_once 'ANN/Loader.php';
try
{
$objNetwork = new ANN_Network;
$objValues = new ANN_Values;
$objValues->train()
->input(0,0)->output(0)
->input(0,1)->output(1)
->input(1,0)->output(1)
->input(1,1)->output(0);
}
catch(Exception $e)
{
die('Network could not be created');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$objNetwork = $objNetwork->trainByHost(
'username',
'password',
'http://example.tld/ANN_Server.php'
);
if($objNetwork instanceof ANN_Network)
$objNetwork->printNetwork();
</source>
= Copyright =
The copyright conditions are included in the source files.
<code>
* Artificial Neural Network - Version 2.1
*
* For updates and changes visit the project page at http://ann.thwien.de/
*
*
*
* <b>LICENCE</b>
*
* This source file is freely re-distributable, with or without modifications
* provided the following conditions are met:
*
* 1. The source files must retain the copyright notice below, this list of
* conditions and the following disclaimer.
*
* 2. The name of the author must not be used to endorse or promote products
* derived from this source file without prior written permission. For
* written permission, please contact me.
*
* <b>DISCLAIMER</b>
*
* THIS SOFTWARE IS PROVIDED BY THE AUTHOR "AS IS" AND
* ANY EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A
* PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE PHP
* AUTHOR OR HIS CONTRIBUTORS BE LIABLE FOR ANY DIRECT,
* INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
* (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
* HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT,
* STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
* ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
* OF THE POSSIBILITY OF SUCH DAMAGE.
*
* @author Eddy Young <jeyoung_at_priscimon_dot_com>
* @author Thomas Wien <info_at_thwien_dot_de>
* @version ANN Version 1.0 by Eddy Young
* @version ANN Version 2.1 by Thomas Wien
* @copyright Copyright (c) 2002 by Eddy Young
* @copyright Copyright (c) 2007-08 by Thomas Wien
* @package ANN
</code>
= Examples =
== Logical Functions ==
Training an artificial neural network on logical functions is interesting for learning how to use such a network, but has no practical use. The notable point about the XOR function is that, in the history of neural network development, it was discovered that XOR cannot be learned by a single neuron, and for a long time it was mathematically difficult to find a solution for connecting several neurons together.
* [[logical XOR function]]
* logical OR function
* logical AND function
== Prediction ==
One benefit of the multilayer perceptron is its ability to make predictions.
* [[Selling Icecreams]]
* [[Daily power consumption]]
* [[Lottery]]
== Strings ==
* [[Detection of language]]
== Classification ==
* [[Detection of language with classification]]
== Input Support Classes ==
* [[Using date input support class]]
== Optimization ==
* [[Internet routing decision]]
== Several functions of ANN ==
* [[Visual network topology]]
* [[Logging network weights]]
* [[Client-Server model]]
= Detection of language =
== FAQ ==
For information about the dat files, see the [[FAQ]] page.
== Training ==
<source lang="php">
require_once '../ANN/Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('strings.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network(1, 8, 2);
$objStringValues = new ANN_StringValue(15); // As of ANN 2.1.1
$objStringValues->saveToFile('input_strings.dat');
$objValues = new ANN_Values;
$objValues->train()
->input($objStringValues->getInputValue('Hallo Welt'))
->output(1, 0) // German
->input($objStringValues->getInputValue('Hello World'))
->output(0, 1); // English
$objValues->saveToFile('values_strings.dat');
unset($objValues);
}
try
{
$objValues = ANN_Values::loadFromFile('values_strings.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues);
$objNetwork->train();
$objNetwork->saveToFile('strings.dat');
$objNetwork->printNetwork();
</source>
== Using trained network ==
<source lang="php">
require_once '../ANN/Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('strings.dat');
}
catch(Exception $e)
{
die('Network cannot be loaded');
}
try
{
$objValues = ANN_Values::loadFromFile('values_strings.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
try
{
$objStringValues = ANN_StringValue::loadFromFile('input_strings.dat');
}
catch(Exception $e)
{
die('Loading of input values failed');
}
$objValues->input($objStringValues->getInputValue('HAllo Welt'));
$objValues->input($objStringValues->getInputValue('Hello World'));
$objValues->input($objStringValues->getInputValue('Hálló Wélt'));
$objValues->input($objStringValues->getInputValue('Hélló Wórld'));
$objValues->input($objStringValues('Hßllo Welt')); // As of PHP 5.3.0
$objValues->input($objStringValues('Hßlló Wórld')); // As of PHP 5.3.0
$objValues->input($objStringValues('Hallo Welt!')); // As of PHP 5.3.0
$objValues->input($objStringValues('Helló Wórld!')); // As of PHP 5.3.0
$objNetwork->setValues($objValues);
$objNetwork->printNetwork();
</source>
= Detection of language with classification =
== FAQ ==
For information about the dat files, see the [[FAQ]] page.
== Training ==
<source lang="php">
require_once '../ANN/Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('strings.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objClassification = new ANN_Classification(2); // As of ANN 2.1.2
$objClassification->addClassifier('german');
$objClassification->addClassifier('english');
$objClassification->saveToFile('classifiers_strings.dat');
$objNetwork = new ANN_Network(1, 8, 2);
$objStringValues = new ANN_StringValue(15); // As of ANN 2.1.1
$objStringValues->saveToFile('input_strings.dat');
$objValues = new ANN_Values;
$objValues->train()
->input($objStringValues->getInputValue('Hallo Welt'))
->output($objClassification->getOutputValue('german'))
->input($objStringValues->getInputValue('Hello World'))
->output($objClassification('english')); // As of PHP 5.3.0
$objValues->saveToFile('values_strings.dat');
unset($objValues);
}
try
{
$objValues = ANN_Values::loadFromFile('values_strings.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues);
$objNetwork->train();
$objNetwork->saveToFile('strings.dat');
$objNetwork->printNetwork();
</source>
== Using trained network ==
<source lang="php">
require_once '../ANN/Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('strings.dat');
}
catch(Exception $e)
{
print 'Network cannot be loaded';
}
try
{
$objValues = ANN_Values::loadFromFile('values_strings.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
try
{
$objStringValues = ANN_StringValue::loadFromFile('input_strings.dat');
}
catch(Exception $e)
{
die('Loading of input values failed');
}
try
{
$objClassification = ANN_Classification::loadFromFile('classifiers_strings.dat');
}
catch(Exception $e)
{
die('Loading of classification failed');
}
$objValues->input($objStringValues->getInputValue('HAllo Welt'));
$objValues->input($objStringValues->getInputValue('Hello World'));
$objValues->input($objStringValues->getInputValue('Hálló Wélt'));
$objValues->input($objStringValues->getInputValue('Hélló Wórld'));
$objValues->input($objStringValues('Hßllo Welt')); // As of PHP 5.3.0
$objValues->input($objStringValues('Hßlló Wórld')); // As of PHP 5.3.0
$objValues->input($objStringValues('Hallo Welt!')); // As of PHP 5.3.0
$objValues->input($objStringValues('Helló Wórld!')); // As of PHP 5.3.0
$objNetwork->setValues($objValues);
$objNetwork->printNetwork();
$arrOutputs = $objNetwork->getOutputs();
foreach($arrOutputs as $arrOutput)
print_r($objClassification->getRealOutputValue($arrOutput));
</source>
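The shorthand calls like $objClassification('english') and $objStringValues('Hallo Welt!') marked "As of PHP 5.3.0" work because an object can be invoked like a function if its class defines the magic __invoke() method. The following self-contained sketch illustrates the mechanism; class ClassificationSketch is hypothetical and only demonstrates the idea, the real ANN_Classification implementation may differ:

```php
<?php
// Hypothetical sketch of the __invoke() shorthand used in the examples above.
// As of PHP 5.3.0 an object can be called like a function if its class
// defines the magic method __invoke().
class ClassificationSketch
{
  private $arrClassifiers = array();

  public function addClassifier($strName)
  {
    $this->arrClassifiers[] = $strName;
  }

  // Returns a binary output vector with 1 at the classifier's position
  public function getOutputValue($strName)
  {
    $arrOutput = array();

    foreach($this->arrClassifiers as $strClassifier)
      $arrOutput[] = ($strClassifier === $strName) ? 1 : 0;

    return $arrOutput;
  }

  // Makes $objSketch('english') behave like getOutputValue('english')
  public function __invoke($strName)
  {
    return $this->getOutputValue($strName);
  }
}

$objSketch = new ClassificationSketch;
$objSketch->addClassifier('german');
$objSketch->addClassifier('english');
var_dump($objSketch('english') === $objSketch->getOutputValue('english')); // bool(true)
```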
0e35d5808a0b53ab0374c7464bb55cf8ccc28d9a
File:Donate.gif
6
22
293
2009-12-28T17:38:35Z
Thwien
2
wikitext
text/x-wiki
da39a3ee5e6b4b0d3255bfef95601890afd80709
Using date input support class
0
23
300
2010-01-06T18:33:26Z
Thwien
2
New page: == FAQ == For information about dat-files have a view to the [[FAQ]] page. == Training == <source lang="php"> require_once 'ANN/Loader.php'; try { $objNetwork = ANN_Network::loadFro...
wikitext
text/x-wiki
== FAQ ==
For information about dat-files, have a look at the [[FAQ]] page.
== Training ==
<source lang="php">
require_once 'ANN/Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network(2, 8, 1);
$objDateInput = new ANN_DateInputs('2010-01-03'); // As of ANN 2.1.3
$objTemperature = new ANN_InputValue(-15, 50); // Temperature in Celsius
$objTemperature->saveToFile('input_temperature.dat');
$objHumidity = new ANN_InputValue(0, 100); // Humidity percentage
$objHumidity->saveToFile('input_humidity.dat');
$objIcecream = new ANN_OutputValue(0, 300); // Quantity of sold ice-creams
$objIcecream->saveToFile('output_quantity.dat');
$objValues = new ANN_Values;
$objValues->train()
->input(
$objTemperature->getInputValue(20),
$objHumidity->getInputValue(10),
$objDateInput->getHolidaysInWeek() // As of ANN 2.1.3
)
->output(
$objIcecream->getOutputValue(20)
)
->input(
$objTemperature->getInputValue(30),
$objHumidity->getInputValue(40),
$objDateInput->getHolidaysInWeek()
)
->output(
$objIcecream->getOutputValue(90)
)
->input(
$objTemperature->getInputValue(32),
$objHumidity->getInputValue(30),
$objDateInput->getHolidaysInWeek()
)
->output(
$objIcecream->getOutputValue(70)
)
->input(
$objTemperature->getInputValue(33),
$objHumidity->getInputValue(20),
$objDateInput->getHolidaysInWeek()
)
->output(
$objIcecream->getOutputValue(75)
);
$objValues->saveToFile('values_icecreams.dat');
unset($objValues);
unset($objTemperature);
unset($objHumidity);
unset($objIcecream);
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // As of ANN 2.0.6
$boolTrained = $objNetwork->train();
print ($boolTrained)
? 'Network trained'
: 'Network not trained completely. Please re-run the script';
$objNetwork->saveToFile('icecreams.dat');
$objNetwork->printNetwork();
</source>
eca9fad5f609769df4397716aa1f2ae2b0a2093a
Using date input support class
0
23
301
300
2010-01-06T18:37:14Z
Thwien
2
wikitext
text/x-wiki
== FAQ ==
For information about dat-files, have a look at the [[FAQ]] page.
== XML for holidays ==
The standard filename is ''Holidays.xml'', but it can be changed by calling ANN_DateInputs::setHolidaysFilename().
<source lang="xml">
<?xml version="1.0" encoding="UTF-8"?>
<holidays>
<holiday>
<day>1</day>
<month>1</month>
<year>any</year>
<country>Germany</country>
<state>any</state>
<description>New Year</description>
</holiday>
<holiday>
<day>24</day>
<month>12</month>
<year>any</year>
<country>Germany</country>
<state>any</state>
<description>Christmas</description>
</holiday>
</holidays>
</source>
== Training ==
<source lang="php">
require_once 'ANN/Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network(2, 8, 1);
$objDateInput = new ANN_DateInputs('2010-01-03'); // As of ANN 2.1.3
$objTemperature = new ANN_InputValue(-15, 50); // Temperature in Celsius
$objTemperature->saveToFile('input_temperature.dat');
$objHumidity = new ANN_InputValue(0, 100); // Humidity percentage
$objHumidity->saveToFile('input_humidity.dat');
$objIcecream = new ANN_OutputValue(0, 300); // Quantity of sold ice-creams
$objIcecream->saveToFile('output_quantity.dat');
$objValues = new ANN_Values;
$objValues->train()
->input(
$objTemperature->getInputValue(20),
$objHumidity->getInputValue(10),
$objDateInput->getHolidaysInWeek() // As of ANN 2.1.3
)
->output(
$objIcecream->getOutputValue(20)
)
->input(
$objTemperature->getInputValue(30),
$objHumidity->getInputValue(40),
$objDateInput->getHolidaysInWeek()
)
->output(
$objIcecream->getOutputValue(90)
)
->input(
$objTemperature->getInputValue(32),
$objHumidity->getInputValue(30),
$objDateInput->getHolidaysInWeek()
)
->output(
$objIcecream->getOutputValue(70)
)
->input(
$objTemperature->getInputValue(33),
$objHumidity->getInputValue(20),
$objDateInput->getHolidaysInWeek()
)
->output(
$objIcecream->getOutputValue(75)
);
$objValues->saveToFile('values_icecreams.dat');
unset($objValues);
unset($objTemperature);
unset($objHumidity);
unset($objIcecream);
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // As of ANN 2.0.6
$boolTrained = $objNetwork->train();
print ($boolTrained)
? 'Network trained'
: 'Network not trained completely. Please re-run the script';
$objNetwork->saveToFile('icecreams.dat');
$objNetwork->printNetwork();
</source>
1a3f1698cb270249434c16b4d6d1156762b5e9d8
302
301
2010-01-06T18:52:45Z
Thwien
2
/* XML for holidays */
wikitext
text/x-wiki
== FAQ ==
For information about dat-files, have a look at the [[FAQ]] page.
== XML for holidays ==
The standard filename is ''Holidays.xml'', but it can be changed by calling ANN_DateInputs::setHolidaysFilename().
<source lang="xml">
<?xml version="1.0" encoding="UTF-8"?>
<holidays>
<holiday>
<day>1</day>
<month>1</month>
<year>any</year>
<country>Germany</country>
<state>any</state>
<description>New Year</description>
</holiday>
<holiday>
<day>24</day>
<month>12</month>
<year>any</year>
<country>Germany</country>
<state>any</state>
<description>Christmas</description>
</holiday>
</holidays>
</source>
== Training ==
<source lang="php">
require_once 'ANN/Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network(2, 8, 1);
$objDateInput = new ANN_DateInputs('2010-01-03'); // As of ANN 2.1.3
$objTemperature = new ANN_InputValue(-15, 50); // Temperature in Celsius
$objTemperature->saveToFile('input_temperature.dat');
$objHumidity = new ANN_InputValue(0, 100); // Humidity percentage
$objHumidity->saveToFile('input_humidity.dat');
$objIcecream = new ANN_OutputValue(0, 300); // Quantity of sold ice-creams
$objIcecream->saveToFile('output_quantity.dat');
$objValues = new ANN_Values;
$objValues->train()
->input(
$objTemperature->getInputValue(20),
$objHumidity->getInputValue(10),
$objDateInput->getHolidaysInWeek() // As of ANN 2.1.3
)
->output(
$objIcecream->getOutputValue(20)
)
->input(
$objTemperature->getInputValue(30),
$objHumidity->getInputValue(40),
$objDateInput->getHolidaysInWeek()
)
->output(
$objIcecream->getOutputValue(90)
)
->input(
$objTemperature->getInputValue(32),
$objHumidity->getInputValue(30),
$objDateInput->getHolidaysInWeek()
)
->output(
$objIcecream->getOutputValue(70)
)
->input(
$objTemperature->getInputValue(33),
$objHumidity->getInputValue(20),
$objDateInput->getHolidaysInWeek()
)
->output(
$objIcecream->getOutputValue(75)
);
$objValues->saveToFile('values_icecreams.dat');
unset($objValues);
unset($objTemperature);
unset($objHumidity);
unset($objIcecream);
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // As of ANN 2.0.6
$boolTrained = $objNetwork->train();
print ($boolTrained)
? 'Network trained'
: 'Network not trained completely. Please re-run the script';
$objNetwork->saveToFile('icecreams.dat');
$objNetwork->printNetwork();
</source>
ccd80ce52181c36a551c359fa02f62c3904f8ab8
303
302
2010-01-06T18:53:40Z
Thwien
2
/* Training */
wikitext
text/x-wiki
== FAQ ==
For information about dat-files, have a look at the [[FAQ]] page.
== XML for holidays ==
The standard filename is ''Holidays.xml'', but it can be changed by calling ANN_DateInputs::setHolidaysFilename().
<source lang="xml">
<?xml version="1.0" encoding="UTF-8"?>
<holidays>
<holiday>
<day>1</day>
<month>1</month>
<year>any</year>
<country>Germany</country>
<state>any</state>
<description>New Year</description>
</holiday>
<holiday>
<day>24</day>
<month>12</month>
<year>any</year>
<country>Germany</country>
<state>any</state>
<description>Christmas</description>
</holiday>
</holidays>
</source>
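The filename override mentioned above can be sketched as follows. Only the method name ANN_DateInputs::setHolidaysFilename() is taken from this page; the call style (static, as the :: notation suggests) and the custom filename are assumptions:

```php
<?php
// Sketch only: pointing ANN_DateInputs at a custom holidays file instead of
// the default Holidays.xml. The static call style and the filename
// 'MyHolidays.xml' are assumptions for illustration.
require_once 'ANN/Loader.php';

ANN_DateInputs::setHolidaysFilename('MyHolidays.xml');

$objDateInput = new ANN_DateInputs('2010-01-03');
print $objDateInput->getHolidaysInWeek(); // holiday input derived from MyHolidays.xml
```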
== Training ==
<source lang="php">
require_once 'ANN/Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new ANN_Network(2, 8, 1);
$objDateInput = new ANN_DateInputs('2010-01-03'); // As of ANN 2.1.3
$objTemperature = new ANN_InputValue(-15, 50); // Temperature in Celsius
$objTemperature->saveToFile('input_temperature.dat');
$objHumidity = new ANN_InputValue(0, 100); // Humidity percentage
$objHumidity->saveToFile('input_humidity.dat');
$objIcecream = new ANN_OutputValue(0, 300); // Quantity of sold ice-creams
$objIcecream->saveToFile('output_quantity.dat');
$objValues = new ANN_Values;
$objValues->train()
->input(
$objTemperature->getInputValue(20),
$objHumidity->getInputValue(10),
$objDateInput->getHolidaysInWeek() // As of ANN 2.1.3
)
->output(
$objIcecream->getOutputValue(20)
)
->input(
$objTemperature->getInputValue(30),
$objHumidity->getInputValue(40),
$objDateInput->getHolidaysInWeek()
)
->output(
$objIcecream->getOutputValue(90)
)
->input(
$objTemperature->getInputValue(32),
$objHumidity->getInputValue(30),
$objDateInput->getHolidaysInWeek()
)
->output(
$objIcecream->getOutputValue(70)
)
->input(
$objTemperature->getInputValue(33),
$objHumidity->getInputValue(20),
$objDateInput->getHolidaysInWeek()
)
->output(
$objIcecream->getOutputValue(75)
);
$objValues->saveToFile('values_icecreams.dat');
unset($objValues);
unset($objTemperature);
unset($objHumidity);
unset($objIcecream);
}
try
{
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = ANN_InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = ANN_OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = ANN_Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // As of ANN 2.0.6
$boolTrained = $objNetwork->train();
print ($boolTrained)
? 'Network trained'
: 'Network not trained completely. Please re-run the script';
$objNetwork->saveToFile('icecreams.dat');
$objNetwork->printNetwork();
</source>
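Once training succeeds, the saved network can be queried with new weather data. A minimal sketch; getOutputs() is taken from the classification example elsewhere in this wiki, and the date '2010-07-15' and input values are arbitrary examples:

```php
<?php
// Sketch only: querying the trained ice-cream network with a new pattern.
// Assumes ANN_Network::getOutputs() returns the raw output values.
require_once 'ANN/Loader.php';

$objNetwork     = ANN_Network::loadFromFile('icecreams.dat');
$objTemperature = ANN_InputValue::loadFromFile('input_temperature.dat');
$objHumidity    = ANN_InputValue::loadFromFile('input_humidity.dat');
$objDateInput   = new ANN_DateInputs('2010-07-15');

$objValues = new ANN_Values;
$objValues->input(
             $objTemperature->getInputValue(28),
             $objHumidity->getInputValue(35),
             $objDateInput->getHolidaysInWeek()
             );
$objNetwork->setValues($objValues);
print_r($objNetwork->getOutputs()); // raw network outputs for the new pattern
```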
0f9e09301eeb938f9b7773fef1b26701c609be60
Installation
0
3
304
298
2010-06-15T00:04:00Z
Asdasd man
19
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or above. If the client-server mechanism of the network class is used, the '''curl extension''' should be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' should be installed.
== Installation ==
* [[Download]] the source code
* Check the integrity of the archive
>md5sum ann213*
* Unpack the source code
>tar -xzf ann213.tar.gz
* Copy the ANN directory to your project's library directory.
* Include it in your source code
<source lang="php">
<?php
require_once 'ANN/Loader.php';
$objNetwork = new ANN_Network;
</source>
'''or''' as phar library (supported as of PHP 5.3.0)
<source lang="php">
<?php
require_once 'phar://ann213.phar.gz';
$objNetwork = new ANN_Network;
</source>
* Learn to use the library: have a look at the [[Examples]] chapter.
* For further information, e.g. about dat-files, have a look at the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer
* Do not use any debugger module like xdebug
* Do not use any profiling tool
* Do not set "max_execution_time = 0" in your php.ini file
00f543ba34790d12db72113eacec59e19460b202
311
304
2011-02-10T12:32:49Z
Thwien
2
/* Performance issues */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or above. If the client-server mechanism of the network class is used, the '''curl extension''' should be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' should be installed.
== Installation ==
* [[Download]] the source code
* Check the integrity of the archive
>md5sum ann213*
* Unpack the source code
>tar -xzf ann213.tar.gz
* Copy the ANN directory to your project's library directory.
* Include it in your source code
<source lang="php">
<?php
require_once 'ANN/Loader.php';
$objNetwork = new ANN_Network;
</source>
'''or''' as phar library (supported as of PHP 5.3.0)
<source lang="php">
<?php
require_once 'phar://ann213.phar.gz';
$objNetwork = new ANN_Network;
</source>
* Learn to use the library: have a look at the [[Examples]] chapter.
* For further information, e.g. about dat-files, have a look at the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer
* Do not use any debugger module like xdebug
* Do not use any profiling tool
* Do not set "max_execution_time = 0" in your php.ini file
* Use PHP 5.3.x
fb974c0b0c9ff34ec0de21fc4de6ab2469ed4e80
339
311
2011-05-23T21:26:09Z
Thwien
2
/* Installation */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or above. If the client-server mechanism of the network class is used, the '''curl extension''' should be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' should be installed.
== Installation ==
* [[Download]] the source code
* Check the integrity of the archive
>md5sum ann214*
* Unpack the source code
>tar -xzf ann214.tar.gz
* Copy the ANN directory to your project's library directory.
* Include it in your source code
<source lang="php">
<?php
require_once 'ANN/Loader.php';
$objNetwork = new ANN_Network;
</source>
'''or''' as phar library (supported as of PHP 5.3.0)
<source lang="php">
<?php
require_once 'phar://ann214.phar.gz';
$objNetwork = new ANN_Network;
</source>
* Learn to use the library: have a look at the [[Examples]] chapter.
* For further information, e.g. about dat-files, have a look at the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer
* Do not use any debugger module like xdebug
* Do not use any profiling tool
* Do not set "max_execution_time = 0" in your php.ini file
* Use PHP 5.3.x
2346413f5af4b6013ce07d2a4aef27def8aa258e
340
339
2011-05-23T21:31:48Z
Thwien
2
/* Performance issues */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or above. If the client-server mechanism of the network class is used, the '''curl extension''' should be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' should be installed.
== Installation ==
* [[Download]] the source code
* Check the integrity of the archive
>md5sum ann214*
* Unpack the source code
>tar -xzf ann214.tar.gz
* Copy the ANN directory to your project's library directory.
* Include it in your source code
<source lang="php">
<?php
require_once 'ANN/Loader.php';
$objNetwork = new ANN_Network;
</source>
'''or''' as phar library (supported as of PHP 5.3.0)
<source lang="php">
<?php
require_once 'phar://ann214.phar.gz';
$objNetwork = new ANN_Network;
</source>
* Learn to use the library: have a look at the [[Examples]] chapter.
* For further information, e.g. about dat-files, have a look at the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer (PHP version before 5.3.0)
* Do not use any debugger module like xdebug
* Do not use any profiling tool
* Do not set "max_execution_time = 0" in your php.ini or .htaccess file
* When running PHP from the Linux console, use "php -d max_execution_time=60 neural.php"
* Alternatively, call ini_set() to set "max_execution_time" at runtime
* Use PHP 5.3.x
77623943aa2860db4f95c1a39dc6f53c88a0226c
343
340
2011-05-24T18:53:12Z
Thwien
2
/* Installation */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.2.x''' or above. If the client-server mechanism of the network class is used, the '''curl extension''' should be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' should be installed.
== Installation ==
* [[Download]] the source code
* Check the integrity of the archive
>md5sum ann215*
* Unpack the source code
>tar -xzf ann215.tar.gz
* Copy the ANN directory to your project's library directory.
* Include it in your source code
<source lang="php">
<?php
require_once 'ANN/Loader.php';
$objNetwork = new ANN_Network;
</source>
'''or''' as phar library (supported as of PHP 5.3.0)
<source lang="php">
<?php
require_once 'phar://ann215.phar.gz';
$objNetwork = new ANN_Network;
</source>
* Learn to use the library: have a look at the [[Examples]] chapter.
* For further information, e.g. about dat-files, have a look at the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer (PHP version before 5.3.0)
* Do not use any debugger module like xdebug
* Do not use any profiling tool
* Do not set "max_execution_time = 0" in your php.ini or .htaccess file
* When running PHP from the Linux console, use "php -d max_execution_time=60 neural.php"
* Alternatively, call ini_set() to set "max_execution_time" at runtime
* Use PHP 5.3.x
a3e23fe3b2ee98edcaa315766fb4868f89a2285a
346
343
2011-06-01T11:22:42Z
Thwien
2
/* Requirements */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.3.x''' or above. If the client-server mechanism of the network class is used, the '''curl extension''' should be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' should be installed.
== Installation ==
* [[Download]] the source code
* Check the integrity of the archive
>md5sum ann215*
* Unpack the source code
>tar -xzf ann215.tar.gz
* Copy the ANN directory to your project's library directory.
* Include it in your source code
<source lang="php">
<?php
require_once 'ANN/Loader.php';
$objNetwork = new ANN_Network;
</source>
'''or''' as phar library (supported as of PHP 5.3.0)
<source lang="php">
<?php
require_once 'phar://ann215.phar.gz';
$objNetwork = new ANN_Network;
</source>
* Learn to use the library: have a look at the [[Examples]] chapter.
* For further information, e.g. about dat-files, have a look at the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer (PHP version before 5.3.0)
* Do not use any debugger module like xdebug
* Do not use any profiling tool
* Do not set "max_execution_time = 0" in your php.ini or .htaccess file
* When running PHP from the Linux console, use "php -d max_execution_time=60 neural.php"
* Alternatively, call ini_set() to set "max_execution_time" at runtime
* Use PHP 5.3.x
eb970b341e01c6cfb4dbb368667b46f52c09f930
347
346
2011-06-01T11:25:38Z
Thwien
2
/* Installation */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.3.x''' or above. If the client-server mechanism of the network class is used, the '''curl extension''' should be available on the client host that connects to an ANN server. If the class ''ANN_NetworkGraph'' is used, the '''GD library with PNG support''' should be installed.
== Installation ==
* [[Download]] the source code
* Check the integrity of the archive
>md5sum ann220*
* Unpack the source code
>tar -xzf ann220.tar.gz
* Copy the ANN directory to your project's library directory.
* Include it in your source code
<source lang="php">
<?php
require_once 'ANN/Loader.php';
use ANN\Network;
$objNetwork = new Network;
</source>
'''or''' as phar library (supported as of PHP 5.3.0)
<source lang="php">
<?php
require_once 'phar://ann220.phar.gz';
use ANN\Network;
$objNetwork = new Network;
</source>
* Learn to use the library: have a look at the [[Examples]] chapter.
* For further information, e.g. about dat-files, have a look at the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer (PHP version before 5.3.0)
* Do not use any debugger module like xdebug
* Do not use any profiling tool
* Do not set "max_execution_time = 0" in your php.ini or .htaccess file
* When running PHP from the Linux console, use "php -d max_execution_time=60 neural.php"
* Alternatively, call ini_set() to set "max_execution_time" at runtime
* Use PHP 5.3.x
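The ini_set() hint above can be sketched as follows; 60 seconds is an arbitrary example value:

```php
<?php
// Sketch: raising the execution time limit at runtime for a long training
// run, instead of changing php.ini globally. 60 seconds is an arbitrary
// example value.
ini_set('max_execution_time', '60');

print ini_get('max_execution_time'); // "60"
```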
7c8d8c4e1fd06877862d71c7defd6c20886bffb8
Talk:Main Page
1
24
305
2010-06-24T22:11:52Z
Linkble
22
New page: hi, after reading all the page of this site i can find your contact mail. Can you give one or maybe help me making the icecream example working. I've tried all possible settings and tried...
wikitext
text/x-wiki
hi,
after reading all the pages of this site I can't find your contact mail. Can you give one, or maybe help me get the icecream example working?
I've tried all possible settings and launched the script many times, but it's impossible to get it completely trained or at least good results...
Thx
9c69246e4fea16472613e32398bf148ab3551764
306
305
2010-06-29T17:39:47Z
Thwien
2
wikitext
text/x-wiki
hi,
after reading all the pages of this site I can't find your contact mail. Can you give one, or maybe help me get the icecream example working?
I've tried all possible settings and launched the script many times, but it's impossible to get it completely trained or at least good results...
Thx
== ==
Do you run this script on Windows? Please try another PHP version, and if possible a Linux/UNIX system. If you still have the same problem, send your phpinfo output via mail to info_at_thwien_de together with your script running the ANN class. Also send the output of ANN_Network::printNetwork() after the first run of the script.
45568f8dc6d637af53ec06fb17b05560dfb04a5f
309
306
2011-02-07T21:07:33Z
Thwien
2
/* PHP 5.3.1 + MediaWiki */ new section
wikitext
text/x-wiki
hi,
after reading all the pages of this site I can't find your contact mail. Can you give one, or maybe help me get the icecream example working?
I've tried all possible settings and launched the script many times, but it's impossible to get it completely trained or at least good results...
Thx
== ==
Do you run this script on Windows? Please try another PHP version, and if possible a Linux/UNIX system. If you still have the same problem, send your phpinfo output via mail to info_at_thwien_de together with your script running the ANN class. Also send the output of ANN_Network::printNetwork() after the first run of the script.
== PHP 5.3.1 + MediaWiki ==
There was a bug in PHP 5.3.1 causing the wiki not to work properly when saving page edits. Sorry if anyone wanted to edit pages but was rejected with an error message. PHP 5.3.5 is running now, so this bug is fixed. Please feel free to continue editing these wiki pages.
3a4e630408e877ebb3fa635a81c7c66693ba9ba8
Main Page
0
1
307
297
2011-02-07T20:49:50Z
Thwien
2
/* Overview */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002. Several improvements and changes to this implementation have been made by ''Thomas Wien'' since 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. To get a brief idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Donate ==
If you want to support the current and future development
of this project, I would appreciate a donation of any amount
via PayPal.
[http://ann.thwien.de/donate/donate.html Donate now]
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Development]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* String association
* Classification
* Phar support (as of PHP 5.3.0)
== Versions and Change-Log ==
'''Version 2.1.4 by Thomas Wien''' (Development)
* Image matrix support
'''Version 2.1.3 by Thomas Wien''' (2010-01-06) [[Download]]
* Date input support class
'''Version 2.1.2 by Thomas Wien''' (2009-12-26)
* Classification support
* Phar support (as of PHP 5.3.0)
'''Version 2.1.1 by Thomas Wien''' (2009-12-23)
* String association support
'''Version 2.1.0 by Thomas Wien''' (2009-12-22)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
* Switching to Git version control
* Moving all experimental code into branch
* Removing all experimental code from master branch (due to performance and future development)
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology known as the ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002; numerous improvements and changes have been made by ''Thomas Wien'' ([http://thwien.de thwien.de] - Düsseldorf - Germany) since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short introduction to the benefits of neural networks, see the [[Neural Networks]] page.
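To give an idea of what the library computes, the forward pass of a multilayer perceptron with a hyperbolic tangent transfer function (one of the transfer functions listed in the change log) can be sketched as follows. This is a language-neutral illustration in Python, not the library's PHP API; all names and weights here are made up for the example:

```python
import math

def forward(inputs, layers):
    """Propagate input values through a multilayer perceptron.

    `layers` is a list of layers; each layer is a list of neurons, and
    each neuron is a list of input weights with its bias as the last
    entry. The transfer function is the hyperbolic tangent.
    """
    values = inputs
    for layer in layers:
        values = [
            math.tanh(sum(w * v for w, v in zip(neuron[:-1], values)) + neuron[-1])
            for neuron in layer
        ]
    return values

# A tiny 2-2-1 network with hand-picked illustrative weights
hidden_layer = [[0.5, -0.4, 0.1], [0.3, 0.8, -0.2]]
output_layer = [[1.0, -1.0, 0.05]]
result = forward([1.0, 0.0], [hidden_layer, output_layer])
```

Because tanh saturates at ±1, every neuron output stays in that range regardless of the weights.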
== Donate ==
If you want to support the current and future development
of this project, I would appreciate a donation of any amount
via PayPal.
[http://ann.thwien.de/donate/donate.html Donate now]
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Development]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* String association
* Classification
* Phar support (as of PHP 5.3.0)
== Versions and Change-Log ==
'''Version 2.x.x by Thomas Wien''' (Development)
* Image matrix support
'''Version 2.2.0 by Thomas Wien''' (2011-06-01) [[Download]]
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is now protected
'''Version 2.1.5 by Thomas Wien''' (2011-05-24) [[Download]]
* Splitting ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
'''Version 2.1.4 by Thomas Wien''' (2011-05-23)
* Better calculation of the remaining running time of the network
* Fixing bug: incorrect generation of the random delta
* Adding a momentum value
* Simplifying ANN_Neuron::adjustWeights()
* Removing a potentially wrong calculation in ANN_Neuron::adjustWeights() for linear networks
* Simplifying ANN_Layer::calculateHiddenDeltas()
'''Version 2.1.3 by Thomas Wien''' (2010-01-06)
* Date input support class
'''Version 2.1.2 by Thomas Wien''' (2009-12-26)
* Classification support
* Phar support (as of PHP 5.3.0)
'''Version 2.1.1 by Thomas Wien''' (2009-12-23)
* String association support
'''Version 2.1.0 by Thomas Wien''' (2009-12-22)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
* Switching to Git version control
* Moving all experimental code into a branch
* Removing all experimental code from the master branch (for performance and future development)
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() returns a float, not an array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing the differences between network outputs and their desired values
* Completely rewritten variable naming standard
* New class ANN_Values for defining input and output values
* Code examples added to the PHPDoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance reasons
* Class loading for all ANN classes (SPL autoload)
* Renaming the file of the ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (skipping training when an input pattern already produces the desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relations at construction time
* More details in printNetwork()
* Fixing bug: learning rate was not part of the saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Improving license agreement of source code
* Adding error codes to exceptions
* Exception if network error does not reach minimum
2958767e2eb328444fe5f5df078ed92669aadf5f
368
348
2011-06-01T14:31:16Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002. Several improvements and changes to this implementation have been made by ''Thomas Wien'' ([http://thwien.de thwien.de] - Düsseldorf - Germany) since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Donate ==
If you want to support the current and future development
of this project, I would appreciate a donation of any amount
via PayPal.
[http://ann.thwien.de/donate/donate.html Donate now]
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Development]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* String association
* Classification
* Phar support (as of PHP 5.3.0)
== Versions and Change-Log ==
'''Version 2.x.x by Thomas Wien''' (Development)
* Image matrix support
'''Version 2.2.0 by Thomas Wien''' (2011-06-01) [[Download]]
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is protected now
* Bug fix: Wrong output type detection in some circumstances
'''Version 2.1.5 by Thomas Wien''' (2011-05-24) [[Download]]
* Splitting method ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
'''Version 2.1.4 by Thomas Wien''' (2011-05-23)
* Better calculation of the network's remaining running time
* Fixing bug: random delta generation was not correct
* Adding momentum value
* Simplified ANN_Neuron::adjustWeights()
* Remove possible wrong calculation in ANN_Neuron::adjustWeights() on linear networks
* Simplify ANN_Layer::calculateHiddenDeltas()
'''Version 2.1.3 by Thomas Wien''' (2010-01-06)
* Date input support class
'''Version 2.1.2 by Thomas Wien''' (2009-12-26)
* Classification support
* Phar support (as of PHP 5.3.0)
'''Version 2.1.1 by Thomas Wien''' (2009-12-23)
* String association support
'''Version 2.1.0 by Thomas Wien''' (2009-12-22)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
* Switching to Git version control
* Moving all experimental code into branch
* Removing all experimental code from master branch (due to performance and future development)
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details showing differences between outputs and their desired values
* Completely rewritten code standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Improving license agreement of source code
* Adding error codes to exceptions
* Exception if network error does not reach minimum
6d175be19e6ede90afd334609307c93492f9e3c5
371
368
2011-06-01T14:48:34Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002. Several improvements and changes to this implementation have been made by ''Thomas Wien'' ([http://thwien.de thwien.de] - Düsseldorf - Germany) since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Donate ==
If you want to support the current and future development
of this project, I would appreciate a donation of any amount
via PayPal.
[http://ann.thwien.de/donate/donate.html Donate now]
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Development]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* String association
* Classification
* Phar support (as of PHP 5.3.0)
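A minimal local-training sketch, assembled from the client example on the [[Client-Server model]] page, shows how these features are typically combined. Note: the \ANN\Network::train() call is an assumption based on the library's phpdoc and may be named differently in your version; all other calls appear in the examples in this wiki.
<source lang="php">
require_once 'ANN/Loader.php';

use ANN\Network;
use ANN\Values;

// Create the network and define the XOR training patterns
$objNetwork = new Network;

$objValues = new Values;

$objValues->train()
          ->input(0, 0)->output(0)
          ->input(0, 1)->output(1)
          ->input(1, 0)->output(1)
          ->input(1, 1)->output(0);

$objNetwork->setValues($objValues); // to be called as of version 2.0.6

$objNetwork->train(); // assumed training call, check the phpdoc of your version

$objNetwork->printNetwork();
</source>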
== Versions and Change-Log ==
'''Version 2.x.x by Thomas Wien''' (Development)
* Image matrix support
'''Version 2.2.0 by Thomas Wien''' (2011-06-01) [[Download]]
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is protected now
* Bug fix: Wrong output type detection in some circumstances
'''Version 2.1.6 by Thomas Wien''' (2011-06-01) [[Download]]
* Bug fix: Wrong output type detection in some circumstances
'''Version 2.1.5 by Thomas Wien''' (2011-05-24)
* Splitting method ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
'''Version 2.1.4 by Thomas Wien''' (2011-05-23)
* Better calculation of the network's remaining running time
* Fixing bug: random delta generation was not correct
* Adding momentum value
* Simplified ANN_Neuron::adjustWeights()
* Remove possible wrong calculation in ANN_Neuron::adjustWeights() on linear networks
* Simplify ANN_Layer::calculateHiddenDeltas()
'''Version 2.1.3 by Thomas Wien''' (2010-01-06)
* Date input support class
'''Version 2.1.2 by Thomas Wien''' (2009-12-26)
* Classification support
* Phar support (as of PHP 5.3.0)
'''Version 2.1.1 by Thomas Wien''' (2009-12-23)
* String association support
'''Version 2.1.0 by Thomas Wien''' (2009-12-22)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
* Switching to Git version control
* Moving all experimental code into branch
* Removing all experimental code from master branch (due to performance and future development)
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details showing differences between outputs and their desired values
* Completely rewritten code standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Improving license agreement of source code
* Adding error codes to exceptions
* Exception if network error does not reach minimum
d16f2eedd2088198c416a5e0c298e666d767611b
Development
0
25
308
2011-02-07T21:03:11Z
Thwien
2
Created page with "<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big> This page offers to participate on the development of ANN implementation for PHP 5.x. Please feel free to clone this..."
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page invites you to participate in the development of the ANN implementation for PHP 5.x. Please feel free to clone the Git repository, enhance the code, and send your repository or patch files to info_at_thwien_dot_de.
== Public ANN Git Repository ==
<source lang="bash">
git clone http://ann.thwien.de/ann.git
</source>
== Rules for developers ==
* Write code, don't copy it from others
* Consider the licence
* Use the same code standard
* Test your code changes before sending patches
* Include examples to the example directory
* Use php doc comments
f4772d0917a81204961c1d1b1d66b131cfae9441
Client-Server model
0
15
310
270
2011-02-08T21:05:20Z
Thwien
2
/* Server implementation */
wikitext
text/x-wiki
== Server implementation ==
<source lang="php">
require_once 'ANN/Loader.php';

class ANN_MyServer extends ANN_Server
{
    /**
     * @param string $strUsername
     * @param string $strPassword
     * @return boolean
     */
    protected function checkLogin($strUsername, $strPassword)
    {
        // User-defined authentication, by database for example
        return ($strUsername == 'username' && $strPassword == 'password');
    }
}

$objServer = new ANN_MyServer;
</source>
== Client implementation ==
<source lang="php">
require_once 'ANN/Loader.php';

try
{
    $objNetwork = new ANN_Network;

    $objValues = new ANN_Values;

    $objValues->train()
              ->input(0, 0)->output(0)
              ->input(0, 1)->output(1)
              ->input(1, 0)->output(1)
              ->input(1, 1)->output(0);
}
catch(Exception $e)
{
    die('Network could not be created');
}

$objNetwork->setValues($objValues); // to be called as of version 2.0.6

$objNetwork = $objNetwork->trainByHost(
    'username',
    'password',
    'http://example.tld/ANN_Server.php'
);

if($objNetwork instanceof ANN_Network)
    $objNetwork->printNetwork();
</source>
be488e6c372bafdc18cf21c206aebae0a5905c6c
359
310
2011-06-01T12:16:19Z
Thwien
2
/* Server implementation */
wikitext
text/x-wiki
== Server implementation ==
<source lang="php">
require_once 'ANN/Loader.php';

use ANN\Server;

class MyServer extends Server
{
    /**
     * @param string $strUsername
     * @param string $strPassword
     * @return boolean
     */
    protected function checkLogin($strUsername, $strPassword)
    {
        // User-defined authentication, by database for example
        return ($strUsername == 'username' && $strPassword == 'password');
    }
}

$objServer = new MyServer;
</source>
== Client implementation ==
<source lang="php">
require_once 'ANN/Loader.php';

try
{
    $objNetwork = new ANN_Network;

    $objValues = new ANN_Values;

    $objValues->train()
              ->input(0, 0)->output(0)
              ->input(0, 1)->output(1)
              ->input(1, 0)->output(1)
              ->input(1, 1)->output(0);
}
catch(Exception $e)
{
    die('Network could not be created');
}

$objNetwork->setValues($objValues); // to be called as of version 2.0.6

$objNetwork = $objNetwork->trainByHost(
    'username',
    'password',
    'http://example.tld/ANN_Server.php'
);

if($objNetwork instanceof ANN_Network)
    $objNetwork->printNetwork();
</source>
5b486fce6491f8d658c0df9c172b55375db93ba0
360
359
2011-06-01T12:17:29Z
Thwien
2
/* Client implementation */
wikitext
text/x-wiki
== Server implementation ==
<source lang="php">
require_once 'ANN/Loader.php';

use ANN\Server;

class MyServer extends Server
{
    /**
     * @param string $strUsername
     * @param string $strPassword
     * @return boolean
     */
    protected function checkLogin($strUsername, $strPassword)
    {
        // User-defined authentication, by database for example
        return ($strUsername == 'username' && $strPassword == 'password');
    }
}

$objServer = new MyServer;
</source>
== Client implementation ==
<source lang="php">
require_once 'ANN/Loader.php';

use ANN\Network;
use ANN\Values;

try
{
    $objNetwork = new Network;

    $objValues = new Values;

    $objValues->train()
              ->input(0, 0)->output(0)
              ->input(0, 1)->output(1)
              ->input(1, 0)->output(1)
              ->input(1, 1)->output(0);
}
catch(Exception $e)
{
    die('Network could not be created');
}

$objNetwork->setValues($objValues); // to be called as of version 2.0.6

$objNetwork = $objNetwork->trainByHost(
    'username',
    'password',
    'http://example.tld/ANN_Server.php'
);

if($objNetwork instanceof Network)
    $objNetwork->printNetwork();
</source>
a43ba93bc50ced96596cb00ed8e384df807c4342
Download
0
2
336
296
2011-05-23T21:15:24Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers the current stable version of the ANN implementation for PHP 5.x for download. See the [[Installation]] section for information about requirements and how to integrate these PHP libraries into your project.
== Version '''2.1.4''' (2011-05-23) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann214.zip Download - ann214.zip] (36 KB)
* [http://ann.thwien.de/downloads/ann214.tar.gz Download - ann214.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann214.phar.gz Download (Phar) - ann214.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 998d377de058b959c5ad83141b168e5e ann214.phar.gz
* 12b7a028021477555613d533410526e0 ann214.tar.gz
* e6762c5f667e1710ac7efdca70cb7c41 ann214.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann214.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Better calculation of the network's remaining running time
* Fixing bug: random delta generation was not correct
* Adding momentum value
* Simplified ANN_Neuron::adjustWeights()
* Remove possible wrong calculation in ANN_Neuron::adjustWeights() on linear networks
* Simplify ANN_Layer::calculateHiddenDeltas()
== Version '''2.1.3''' (2010-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann213.zip Download - ann213.zip] (34 KB)
* [http://ann.thwien.de/downloads/ann213.tar.gz Download - ann213.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann213.phar.gz Download (Phar) - ann213.phar.gz] (20 KB)
'''MD5 fingerprints'''
* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann213.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction to date input support class
== Version '''2.1.2''' (2009-12-26) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann212.zip Download - ann212.zip] (31 KB)
* [http://ann.thwien.de/downloads/ann212.tar.gz Download - ann212.tar.gz] (16 KB)
* [http://ann.thwien.de/downloads/ann212.phar.gz Download (Phar) - ann212.phar.gz] (18 KB)
'''MD5 fingerprints'''
* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann212_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of classification with ANN_Classification (for an example, see [[Detection of language with classification]])
* Phar support (as of PHP 5.3.0)
== Version '''2.1.1''' (2009-12-23) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [http://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of string association with ANN_StringValue (for an example, see [[Detection of language]])
== Version '''2.1.0''' (2009-12-22) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [http://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate to 0.7
* Code changes referring profiling
* Change printing network details formatting
* Remove trailing php end tag
* Code-Standard
* Remove momentum, precision and unused math methods
* Remove unused methods
* Remove error weight derivative
* Removing weight decay
* Removing dynamic learning rate
* Removing algorithm switches and experimental algorithms. Just standard back propagation algorithm used
* Renaming class file names
* Learning rate and delta using
* Using learning rate
* Rounding of network error value
== Version '''2.0.7''' (2009-01-01) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
== Version '''2.0.6''' (2008-12-18) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details showing differences between outputs and their desired values
* Completely rewritten code standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
* [http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
df85d13774f485f12fa830e1a638f94de7c0394a
341
336
2011-05-24T18:50:34Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers the current stable version of the ANN implementation for PHP 5.x for download. See the [[Installation]] section for information about requirements and how to integrate these PHP libraries into your project.
== Version '''2.1.5''' (2011-05-24) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann215.zip Download - ann215.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann215.tar.gz Download - ann215.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann215.phar.gz Download (Phar) - ann215.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 96ad7c92b08a44d89fd4a7a17b463a9e ann215.phar.gz
* 88ffad519058881f90eb24b4363d0d94 ann215.tar.gz
* 2d293c4f92d0d17f3e0e5ba8e79e3dbd ann215.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann215.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Splitting method ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
== Version '''2.1.4''' (2011-05-23) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann214.zip Download - ann214.zip] (36 KB)
* [http://ann.thwien.de/downloads/ann214.tar.gz Download - ann214.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann214.phar.gz Download (Phar) - ann214.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 998d377de058b959c5ad83141b168e5e ann214.phar.gz
* 12b7a028021477555613d533410526e0 ann214.tar.gz
* e6762c5f667e1710ac7efdca70cb7c41 ann214.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann214.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Better calculation of the network's remaining running time
* Fixing bug: random delta generation was not correct
* Adding momentum value
* Simplified ANN_Neuron::adjustWeights()
* Remove possible wrong calculation in ANN_Neuron::adjustWeights() on linear networks
* Simplify ANN_Layer::calculateHiddenDeltas()
== Version '''2.1.3''' (2010-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann213.zip Download - ann213.zip] (34 KB)
* [http://ann.thwien.de/downloads/ann213.tar.gz Download - ann213.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann213.phar.gz Download (Phar) - ann213.phar.gz] (20 KB)
'''MD5 finger prints'''
* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann213.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction to date input support class
== Version '''2.1.2''' (2009-12-26) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann212.zip Download - ann212.zip] (31 KB)
* [http://ann.thwien.de/downloads/ann212.tar.gz Download - ann212.tar.gz] (16 KB)
* [http://ann.thwien.de/downloads/ann212.phar.gz Download (Phar) - ann212.phar.gz] (18 KB)
'''MD5 finger prints'''
* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann212_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction to classification with ANN_Classification (Example see: [[Detection of language with classification]])
* Phar support (as of PHP 5.3.0)
== Version '''2.1.1''' (2009-12-23) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [http://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction to string association with ANN_StringValue (Example see: [[Detection of language]])
== Version '''2.1.0''' (2009-12-22) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [http://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate to 0.7
* Code changes referring profiling
* Change printing network details formatting
* Remove trailing php end tag
* Code-Standard
* Remove momentum, precision and unused math methods
* Remove unused methods
* Remove error weight derivative
* Removing weight decay
* Removing dynamic learning rate
* Removing algorithm switches and experimental algorithms. Just standard back propagation algorithm used
* Renaming class file names
* Learning rate and delta using
* Using learning rate
* Rounding of network error value
== Version '''2.0.7''' (2009-01-01) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
== Version '''2.0.6''' (2008-12-18) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparision in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Tangens hyperbolicus transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
* [http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
== Version '''2.2.0''' (2011-06-01) '''''stable''''' (PHP 5.3 or above) ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann220.zip Download - ann220.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann220.tar.gz Download - ann220.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann220.phar.gz Download (Phar) - ann220.phar.gz] (21 KB)
'''MD5 fingerprints'''
* 1de88cd1077247d3c43ef37c99a9486b ann220.phar.gz
* 87513d54d7558230c496cba478e585fd ann220.tar.gz
* 2f4ea0dea8ca91d6d33b344268f637d2 ann220.zip
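To verify a downloaded archive against the fingerprints above, PHP's built-in md5_file() can be used. A minimal sketch, assuming the archive lies in the current working directory (the fingerprint is the one published for ann220.zip above):

```php
<?php
// Compare a downloaded archive against its published MD5 fingerprint.
// The fingerprint is taken from the list above; the file path is an
// assumption (archive in the current working directory).
function verifyMd5($file, $expected)
{
    return is_file($file) && md5_file($file) === $expected;
}

if (verifyMd5('ann220.zip', '2f4ea0dea8ca91d6d33b344268f637d2')) {
    echo "ann220.zip: checksum OK\n";
} else {
    echo "ann220.zip: checksum MISMATCH or file missing\n";
}
```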
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann220.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is protected now
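With the move to namespaces, code that previously instantiated ANN_Network addresses the class as \ANN\Network instead. A minimal sketch, assuming a bundled bootstrap file named ANN/Loader.php; the actual entry point and any constructor arguments may differ, so consult the documentation linked above:

```php
<?php
// Version 2.2.0+ places the classes in the \ANN namespace (PHP 5.3+).
// The require path below is an assumption for illustration only.
require_once 'ANN/Loader.php';

use ANN\Network;

// Formerly instantiated as: $network = new ANN_Network(...);
$network = new Network();
```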
== Version '''2.1.5''' (2011-05-24) '''''stable''''' (PHP 5.2) ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann215.zip Download - ann215.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann215.tar.gz Download - ann215.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann215.phar.gz Download (Phar) - ann215.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 96ad7c92b08a44d89fd4a7a17b463a9e ann215.phar.gz
* 88ffad519058881f90eb24b4363d0d94 ann215.tar.gz
* 2d293c4f92d0d17f3e0e5ba8e79e3dbd ann215.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann215.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Splitting method ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
== Version '''2.1.4''' (2011-05-23) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann214.zip Download - ann214.zip] (36 KB)
* [http://ann.thwien.de/downloads/ann214.tar.gz Download - ann214.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann214.phar.gz Download (Phar) - ann214.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 998d377de058b959c5ad83141b168e5e ann214.phar.gz
* 12b7a028021477555613d533410526e0 ann214.tar.gz
* e6762c5f667e1710ac7efdca70cb7c41 ann214.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann214.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Better calculation of the network's remaining running time
* Fixing bug: incorrect generation of random delta
* Adding momentum value
* Simplified ANN_Neuron::adjustWeights()
* Removed a possibly wrong calculation in ANN_Neuron::adjustWeights() for linear networks
* Simplified ANN_Layer::calculateHiddenDeltas()
== Version '''2.1.3''' (2010-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann213.zip Download - ann213.zip] (34 KB)
* [http://ann.thwien.de/downloads/ann213.tar.gz Download - ann213.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann213.phar.gz Download (Phar) - ann213.phar.gz] (20 KB)
'''MD5 fingerprints'''
* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann213.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of a date input support class
== Version '''2.1.2''' (2009-12-26) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann212.zip Download - ann212.zip] (31 KB)
* [http://ann.thwien.de/downloads/ann212.tar.gz Download - ann212.tar.gz] (16 KB)
* [http://ann.thwien.de/downloads/ann212.phar.gz Download (Phar) - ann212.phar.gz] (18 KB)
'''MD5 finger prints'''
* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann212_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of classification with ANN_Classification (for an example, see [[Detection of language with classification]])
* Phar support (as of PHP 5.3.0)
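The Phar distribution can be consumed through PHP's phar:// stream wrapper (PHP 5.3.0+ with the phar extension). A sketch under the assumption that the compressed archive has been unpacked to ann212.phar and exposes a Loader.php entry point; both the archive name and the internal path are illustrative:

```php
<?php
// Loading the library out of the Phar archive via the phar:// wrapper.
// The internal path is an assumption; inspect the archive for its layout.
require_once 'phar://ann212.phar/ANN/Loader.php';

// From here on, the ANN_* classes are available as usual.
```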
== Version '''2.1.1''' (2009-12-23) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [http://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of string association with ANN_StringValue (for an example, see [[Detection of language]])
== Version '''2.1.0''' (2009-12-22) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [http://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate set to 0.7
* Code changes based on profiling results
* Changed formatting of printed network details
* Removed trailing PHP end tag
* Coding standard improvements
* Removed momentum, precision and unused math methods
* Removed unused methods
* Removed error weight derivative
* Removed weight decay
* Removed dynamic learning rate
* Removed algorithm switches and experimental algorithms; only the standard backpropagation algorithm is used
* Renamed class file names
* Reworked use of learning rate and delta
* Rounding of network error value
== Version '''2.0.7''' (2009-01-01) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() returns a float and not an array
== Version '''2.0.6''' (2008-12-18) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details showing differences between outputs and their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples added to phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 percent
* Internal rounding of floats for performance reasons
* Class loading for all ANN classes (SPL autoload)
* Renamed file name of the ANN_Maths class
* Improved coding standard
* Fixing bug: comparison in ANN_InputValue and ANN_OutputValue
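The SPL autoloading mentioned above can be pictured as a registered callback that maps a class name such as ANN_Network to its class file. This is a generic sketch of the mechanism, with an assumed naming scheme, not the library's actual loader:

```php
<?php
// Generic SPL autoload sketch (PHP 5.1.2+): translate a class name like
// ANN_Network into a file path and require it on first use.
// The underscore-to-directory naming scheme is an assumption.
spl_autoload_register(function ($class) {
    $file = dirname(__FILE__) . '/' . str_replace('_', '/', $class) . '.php';
    if (is_file($file)) {
        require $file;
    }
});

// "new ANN_Network(...)" would now trigger loading of ANN/Network.php.
```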
== Version '''2.0.4''' (2008-01-27) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduction of overfitting (no training if an input pattern already produces the desired output)
* Increased performance of activation
* Increased performance when testing all patterns against their desired outputs
* Increased performance when calculating hidden deltas
* Increased performance by defining layer relations at construction
* More details in printNetwork()
* Fixing bug: learning rate was not part of the saved delta value
== Version '''2.0.3''' (2008-01-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-server model for distributed applications
* Calculating total network error for CSV logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes into several files
* Version control with Subversion
* Performance improvements
* Graphical output of the neural network topology
* Logging of weights to a CSV file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
* [http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
3072118df2339521d06ebc87bac6cea56e13c6fc
364
363
2011-06-01T13:18:38Z
Thwien
2
/* Version 2.2.0 (2011-06-01) stable (PHP 5.3 or above) */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers downloads of the current stable version of the ANN implementation for PHP 5.x. See the [[Installation]] section for information on requirements and on how to integrate these PHP libraries into your project.
== Version '''2.2.0''' (2011-06-01) '''''stable''''' (PHP 5.3 or above) ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann220.zip Download - ann220.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann220.tar.gz Download - ann220.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann220.phar.gz Download (Phar) - ann220.phar.gz] (21 KB)
'''MD5 fingerprints'''
* 1de88cd1077247d3c43ef37c99a9486b ann220.phar.gz
* 87513d54d7558230c496cba478e585fd ann220.tar.gz
* 2f4ea0dea8ca91d6d33b344268f637d2 ann220.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann220.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is now protected
== Version '''2.1.5''' (2011-05-24) '''''stable''''' (PHP 5.2) ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann215.zip Download - ann215.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann215.tar.gz Download - ann215.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann215.phar.gz Download (Phar) - ann215.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 96ad7c92b08a44d89fd4a7a17b463a9e ann215.phar.gz
* 88ffad519058881f90eb24b4363d0d94 ann215.tar.gz
* 2d293c4f92d0d17f3e0e5ba8e79e3dbd ann215.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann215.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Splitting ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
== Version '''2.1.4''' (2011-05-23) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann214.zip Download - ann214.zip] (36 KB)
* [http://ann.thwien.de/downloads/ann214.tar.gz Download - ann214.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann214.phar.gz Download (Phar) - ann214.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 998d377de058b959c5ad83141b168e5e ann214.phar.gz
* 12b7a028021477555613d533410526e0 ann214.tar.gz
* e6762c5f667e1710ac7efdca70cb7c41 ann214.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann214.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Better calculation of the network's remaining run time
* Fixing bug: incorrect generation of random delta
* Adding momentum value
* Simplifying ANN_Neuron::adjustWeights()
* Removing a possibly incorrect calculation in ANN_Neuron::adjustWeights() for linear networks
* Simplifying ANN_Layer::calculateHiddenDeltas()
== Version '''2.1.3''' (2010-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann213.zip Download - ann213.zip] (34 KB)
* [http://ann.thwien.de/downloads/ann213.tar.gz Download - ann213.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann213.phar.gz Download (Phar) - ann213.phar.gz] (20 KB)
'''MD5 fingerprints'''
* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann213.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of a date input support class
== Version '''2.1.2''' (2009-12-26) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann212.zip Download - ann212.zip] (31 KB)
* [http://ann.thwien.de/downloads/ann212.tar.gz Download - ann212.tar.gz] (16 KB)
* [http://ann.thwien.de/downloads/ann212.phar.gz Download (Phar) - ann212.phar.gz] (18 KB)
'''MD5 fingerprints'''
* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann212_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of classification with ANN_Classification (for an example, see [[Detection of language with classification]])
* Phar support (as of PHP 5.3.0)
== Version '''2.1.1''' (2009-12-23) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [http://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of string association with ANN_StringValue (for an example, see [[Detection of language]])
== Version '''2.1.0''' (2009-12-22) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [http://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate set to 0.7
* Code changes based on profiling
* Changed formatting of printed network details
* Removing trailing PHP end tag
* Coding standard improvements
* Removing momentum, precision and unused math methods
* Removing unused methods
* Removing error weight derivative
* Removing weight decay
* Removing dynamic learning rate
* Removing algorithm switches and experimental algorithms; only the standard backpropagation algorithm remains
* Renaming class file names
* Revised use of learning rate and delta
* Rounding of network error value
== Version '''2.0.7''' (2009-01-01) '''''stable''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() returns a float, not an array
== Version '''2.0.6''' (2008-12-18) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details including output differences from their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 percent
* Internal rounding of floats for performance reasons
* Class loading for all ANN classes (SPL autoload)
* Renaming filename of the ANN_Maths class
* Improving coding standard
* Fixing bug: comparison in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduction of overfitting (no training if an input pattern already produces the desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns against their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relations at construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for CSV logging
== Version '''2.0.1''' (2008-01-06) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes into several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to a CSV file
== Version '''2.0.0''' (2007-12-17) '''''obsolete''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
* [http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
3a7e653e5d637e218af0f77eb2db3908f740c34d
365
364
2011-06-01T13:50:13Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers downloads of the current stable version of the ANN implementation for PHP 5.x. See the [[Installation]] section for information on requirements and on how to integrate these PHP libraries into your project.
== Version '''2.2.0''' (2011-06-01) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.3 or above)</span> ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann220.zip Download - ann220.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann220.tar.gz Download - ann220.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann220.phar.gz Download (Phar) - ann220.phar.gz] (21 KB)
'''MD5 fingerprints'''
* 1de88cd1077247d3c43ef37c99a9486b ann220.phar.gz
* 87513d54d7558230c496cba478e585fd ann220.tar.gz
* 2f4ea0dea8ca91d6d33b344268f637d2 ann220.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann220.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is now protected
== Version '''2.1.5''' (2011-05-24) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.2)</span> ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann215.zip Download - ann215.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann215.tar.gz Download - ann215.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann215.phar.gz Download (Phar) - ann215.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 96ad7c92b08a44d89fd4a7a17b463a9e ann215.phar.gz
* 88ffad519058881f90eb24b4363d0d94 ann215.tar.gz
* 2d293c4f92d0d17f3e0e5ba8e79e3dbd ann215.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann215.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Splitting ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
== Version '''2.1.4''' (2011-05-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann214.zip Download - ann214.zip] (36 KB)
* [http://ann.thwien.de/downloads/ann214.tar.gz Download - ann214.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann214.phar.gz Download (Phar) - ann214.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 998d377de058b959c5ad83141b168e5e ann214.phar.gz
* 12b7a028021477555613d533410526e0 ann214.tar.gz
* e6762c5f667e1710ac7efdca70cb7c41 ann214.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann214.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Better calculation of the network's remaining run time
* Fixing bug: incorrect generation of random delta
* Adding momentum value
* Simplifying ANN_Neuron::adjustWeights()
* Removing a possibly incorrect calculation in ANN_Neuron::adjustWeights() for linear networks
* Simplifying ANN_Layer::calculateHiddenDeltas()
== Version '''2.1.3''' (2010-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann213.zip Download - ann213.zip] (34 KB)
* [http://ann.thwien.de/downloads/ann213.tar.gz Download - ann213.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann213.phar.gz Download (Phar) - ann213.phar.gz] (20 KB)
'''MD5 fingerprints'''
* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann213.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of a date input support class
== Version '''2.1.2''' (2009-12-26) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann212.zip Download - ann212.zip] (31 KB)
* [http://ann.thwien.de/downloads/ann212.tar.gz Download - ann212.tar.gz] (16 KB)
* [http://ann.thwien.de/downloads/ann212.phar.gz Download (Phar) - ann212.phar.gz] (18 KB)
'''MD5 fingerprints'''
* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann212_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of classification with ANN_Classification (for an example, see [[Detection of language with classification]])
* Phar support (as of PHP 5.3.0)
== Version '''2.1.1''' (2009-12-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [http://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of string association with ANN_StringValue (for an example, see [[Detection of language]])
== Version '''2.1.0''' (2009-12-22) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [http://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate set to 0.7
* Code changes based on profiling
* Changed formatting of printed network details
* Removing trailing PHP end tag
* Coding standard improvements
* Removing momentum, precision and unused math methods
* Removing unused methods
* Removing error weight derivative
* Removing weight decay
* Removing dynamic learning rate
* Removing algorithm switches and experimental algorithms; only the standard backpropagation algorithm remains
* Renaming class file names
* Revised use of learning rate and delta
* Rounding of network error value
== Version '''2.0.7''' (2009-01-01) '''''<span style="color: #2E8B57">stable</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() returns a float, not an array
== Version '''2.0.6''' (2008-12-18) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details including output differences from their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 percent
* Internal rounding of floats for performance reasons
* Class loading for all ANN classes (SPL autoload)
* Renaming filename of the ANN_Maths class
* Improving coding standard
* Fixing bug: comparison in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduction of overfitting (no training if an input pattern already produces the desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns against their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relations at construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for CSV logging
== Version '''2.0.1''' (2008-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes into several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to a CSV file
== Version '''2.0.0''' (2007-12-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
* [http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
38be36a8a185476b770edfc3bf402ce82244736e
366
365
2011-06-01T14:28:48Z
Thwien
2
/* Version 2.2.0 (2011-06-01) stable (PHP 5.3 or above) */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers downloads of the current stable version of the ANN implementation for PHP 5.x. See the [[Installation]] section for information on requirements and on how to integrate these PHP libraries into your project.
== Version '''2.2.0''' (2011-06-01) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.3 or above)</span> ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann220.zip Download - ann220.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann220.tar.gz Download - ann220.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann220.phar.gz Download (Phar) - ann220.phar.gz] (21 KB)
'''MD5 fingerprints'''
* 1de88cd1077247d3c43ef37c99a9486b ann220.phar.gz
* 87513d54d7558230c496cba478e585fd ann220.tar.gz
* 2f4ea0dea8ca91d6d33b344268f637d2 ann220.zip
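A downloaded archive can be checked against the fingerprints above before unpacking. A minimal sketch using GNU `md5sum` with a stand-in file (for a real download, substitute one of the hash/filename pairs listed above, e.g. `2f4ea0dea8ca91d6d33b344268f637d2  ann220.zip`):

```shell
# Create a stand-in file, then verify it the same way a downloaded
# archive would be verified against its published MD5 fingerprint:
#   echo "2f4ea0dea8ca91d6d33b344268f637d2  ann220.zip" | md5sum -c -
printf 'demo' > ann-demo.txt
echo "$(md5sum ann-demo.txt | cut -d' ' -f1)  ann-demo.txt" | md5sum -c -
# md5sum -c prints "<filename>: OK" and exits 0 when the hash matches
```

On systems without GNU coreutils (e.g. macOS), `md5 -r` or `openssl md5` produce comparable digests.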
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann220.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is now protected
* Bug fix: Wrong output type detection in some circumstances
== Version '''2.1.5''' (2011-05-24) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.2)</span> ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann215.zip Download - ann215.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann215.tar.gz Download - ann215.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann215.phar.gz Download (Phar) - ann215.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 96ad7c92b08a44d89fd4a7a17b463a9e ann215.phar.gz
* 88ffad519058881f90eb24b4363d0d94 ann215.tar.gz
* 2d293c4f92d0d17f3e0e5ba8e79e3dbd ann215.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann215.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Splitting ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
== Version '''2.1.4''' (2011-05-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann214.zip Download - ann214.zip] (36 KB)
* [http://ann.thwien.de/downloads/ann214.tar.gz Download - ann214.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann214.phar.gz Download (Phar) - ann214.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 998d377de058b959c5ad83141b168e5e ann214.phar.gz
* 12b7a028021477555613d533410526e0 ann214.tar.gz
* e6762c5f667e1710ac7efdca70cb7c41 ann214.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann214.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Better calculation of the network's remaining run time
* Fixing bug: incorrect generation of random delta
* Adding momentum value
* Simplifying ANN_Neuron::adjustWeights()
* Removing a possibly incorrect calculation in ANN_Neuron::adjustWeights() for linear networks
* Simplifying ANN_Layer::calculateHiddenDeltas()
== Version '''2.1.3''' (2010-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann213.zip Download - ann213.zip] (34 KB)
* [http://ann.thwien.de/downloads/ann213.tar.gz Download - ann213.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann213.phar.gz Download (Phar) - ann213.phar.gz] (20 KB)
'''MD5 fingerprints'''
* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann213.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of a date input support class
== Version '''2.1.2''' (2009-12-26) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann212.zip Download - ann212.zip] (31 KB)
* [http://ann.thwien.de/downloads/ann212.tar.gz Download - ann212.tar.gz] (16 KB)
* [http://ann.thwien.de/downloads/ann212.phar.gz Download (Phar) - ann212.phar.gz] (18 KB)
'''MD5 fingerprints'''
* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann212_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of classification with ANN_Classification (for an example, see [[Detection of language with classification]])
* Phar support (as of PHP 5.3.0)
== Version '''2.1.1''' (2009-12-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [http://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of string association with ANN_StringValue (for an example, see [[Detection of language]])
== Version '''2.1.0''' (2009-12-22) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [http://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate set to 0.7
* Code changes based on profiling
* Changed formatting of printed network details
* Removing trailing PHP end tag
* Coding standard improvements
* Removing momentum, precision and unused math methods
* Removing unused methods
* Removing error weight derivative
* Removing weight decay
* Removing dynamic learning rate
* Removing algorithm switches and experimental algorithms; only the standard backpropagation algorithm remains
* Renaming class file names
* Revised use of learning rate and delta
* Rounding of network error value
== Version '''2.0.7''' (2009-01-01) '''''<span style="color: #2E8B57">stable</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() returns a float, not an array
== Version '''2.0.6''' (2008-12-18) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparision in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Tangens hyperbolicus transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
* [http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
ab4ae62f12f58bf332c292d3c8fb178385c68cae
367
366
2011-06-01T14:29:28Z
Thwien
2
/* Version 2.2.0 (2011-06-01) stable (PHP 5.3 or above) */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers downloads of the current stable versions of the ANN implementation for PHP 5.x. See the [[Installation]] section for requirements and for information on how to integrate these PHP libraries into your project.
== Version '''2.2.0''' (2011-06-01) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.3 or above)</span> ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann220.zip Download - ann220.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann220.tar.gz Download - ann220.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann220.phar.gz Download (Phar) - ann220.phar.gz] (21 KB)
'''MD5 fingerprints'''
* d25f4b817539cdcb404b93c61fe76f47 ann220.phar.gz
* 6599d8bbbdeae02b0ffbbf5d7cb3b426 ann220.tar.gz
* a79e15ecd9a81038982affd4f8a1cc51 ann220.zip
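The fingerprints above can be checked programmatically. A minimal sketch in PHP, assuming the archive sits in the current directory; the file name and hash shown are the 2.2.0 zip pair from this page, so adjust both for the package you actually downloaded:

```php
<?php
// Verify a downloaded archive against an MD5 fingerprint from this page.
// File name and expected hash are examples; use the pair for your download.
$file     = 'ann220.zip';
$expected = 'a79e15ecd9a81038982affd4f8a1cc51';

if (!is_file($file)) {
    exit("File not found: $file\n");
}

$actual = md5_file($file);
echo $actual === $expected ? "OK\n" : "MISMATCH: $actual\n";
```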
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann220.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is protected now
* Bug fix: Wrong output type detection in some circumstances
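For calling code, the namespace change above means the ANN_ class-name prefix becomes the \ANN\ namespace. A generic sketch of the mechanics with a stand-in class, not the library's actual code:

```php
<?php
// Stand-in illustration of the PHP 5.3 namespace change in 2.2.0:
// a class formerly addressed as ANN_Network becomes \ANN\Network.
namespace ANN;

class Network {} // stand-in only; the real class ships with the library

$net = new \ANN\Network();
var_dump($net instanceof \ANN\Network); // bool(true)
```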
== Version '''2.1.6''' (2011-06-01) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.2)</span> ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann216.zip Download - ann216.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann216.tar.gz Download - ann216.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann216.phar.gz Download (Phar) - ann216.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 4acbdb23bed762438a0c06c4f23f6a38 ann216.phar.gz
* 7cb3a1b1e17e1272dc02fa7fcbc7d964 ann216.tar.gz
* 9168c4f62693225d9a2785d7a72524de ann216.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann216.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in some circumstances
== Version '''2.1.5''' (2011-05-24) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann215.zip Download - ann215.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann215.tar.gz Download - ann215.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann215.phar.gz Download (Phar) - ann215.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 96ad7c92b08a44d89fd4a7a17b463a9e ann215.phar.gz
* 88ffad519058881f90eb24b4363d0d94 ann215.tar.gz
* 2d293c4f92d0d17f3e0e5ba8e79e3dbd ann215.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann215.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Splitting method ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
== Version '''2.1.4''' (2011-05-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann214.zip Download - ann214.zip] (36 KB)
* [http://ann.thwien.de/downloads/ann214.tar.gz Download - ann214.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann214.phar.gz Download (Phar) - ann214.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 998d377de058b959c5ad83141b168e5e ann214.phar.gz
* 12b7a028021477555613d533410526e0 ann214.tar.gz
* e6762c5f667e1710ac7efdca70cb7c41 ann214.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann214.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Better calculation of the network's remaining run time
* Fixing bug: incorrect generation of the random delta
* Adding momentum value
* Simplified ANN_Neuron::adjustWeights()
* Remove possible wrong calculation in ANN_Neuron::adjustWeights() on linear networks
* Simplify ANN_Layer::calculateHiddenDeltas()
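The momentum value added here enters the standard backpropagation weight update: each weight change blends the current gradient step with a fraction of the previous change, smoothing the descent. A generic illustration with made-up values, not the library's internals:

```php
<?php
// One backpropagation weight update with momentum (generic illustration).
$learningRate  = 0.7;  // the library's default learning rate (see 2.1.0 notes)
$momentum      = 0.9;  // illustrative momentum value
$weight        = 0.5;  // current weight
$previousDelta = 0.0;  // weight change from the previous step
$gradient      = 0.2;  // illustrative error gradient for this weight

$delta          = $learningRate * $gradient + $momentum * $previousDelta;
$weight        -= $delta;
$previousDelta  = $delta;

printf("delta=%.2f weight=%.2f\n", $delta, $weight); // delta=0.14 weight=0.36
```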
== Version '''2.1.3''' (2010-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann213.zip Download - ann213.zip] (34 KB)
* [http://ann.thwien.de/downloads/ann213.tar.gz Download - ann213.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann213.phar.gz Download (Phar) - ann213.phar.gz] (20 KB)
'''MD5 fingerprints'''
* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann213.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of a date input support class
== Version '''2.1.2''' (2009-12-26) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann212.zip Download - ann212.zip] (31 KB)
* [http://ann.thwien.de/downloads/ann212.tar.gz Download - ann212.tar.gz] (16 KB)
* [http://ann.thwien.de/downloads/ann212.phar.gz Download (Phar) - ann212.phar.gz] (18 KB)
'''MD5 fingerprints'''
* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann212_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of classification with ANN_Classification (for an example, see [[Detection of language with classification]])
* Phar support (as of PHP 5.3.0)
== Version '''2.1.1''' (2009-12-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [http://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of string association with ANN_StringValue (for an example, see [[Detection of language]])
== Version '''2.1.0''' (2009-12-22) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [http://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate set to 0.7
* Code changes based on profiling
* Changed formatting of printed network details
* Removing trailing PHP end tag
* Improving code standard
* Removing momentum, precision, and unused math methods
* Removing unused methods
* Removing error weight derivative
* Removing weight decay
* Removing dynamic learning rate
* Removing algorithm switches and experimental algorithms; only standard backpropagation is used now
* Renaming class file names
* Using learning rate and delta
* Rounding of the network error value
== Version '''2.0.7''' (2009-01-01) '''''<span style="color: #2E8B57">stable</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
== Version '''2.0.6''' (2008-12-18) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details of output differences from their desired values
* Completely rewritten code standard for variables
* New class ANN_Values for defining input and output values
* Code examples added to the phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 percent
* Internal rounding of floats for performance reasons
* Class loading for all ANN classes (SPL autoload)
* Renaming the file name of the ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
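The SPL autoloading mentioned above works by registering a callback that maps a class name to its class file. A generic sketch of the approach, using a named function since PHP 5.2 has no closures; the ANN/ directory layout here is an assumption for illustration, and the library ships its own loader:

```php
<?php
// Generic sketch of SPL class autoloading for ANN_* classes.
// The ANN/ directory layout is an assumption, not the library's actual one.
function ann_autoload($className)
{
    if (strpos($className, 'ANN_') === 0) {
        $file = dirname(__FILE__) . '/ANN/' . substr($className, 4) . '.php';
        if (is_file($file)) {
            require $file;
        }
    }
}
spl_autoload_register('ann_autoload');
```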
== Version '''2.0.4''' (2008-01-27) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduction of overfitting (no training if an input pattern already produces its desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for CSV logging
== Version '''2.0.1''' (2008-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes into several files
* Version control by Subversion
* Performance improvements
* Graphical output of the neural network topology
* Logging of weights to a CSV file
== Version '''2.0.0''' (2007-12-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
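The hyperbolic tangent transfer function introduced in this release squashes a neuron's weighted input sum into the range (-1, 1), with the steepest response around zero; PHP's built-in tanh() computes it directly. A minimal illustration, not the library's code:

```php
<?php
// Hyperbolic tangent as a neuron transfer function: maps any weighted
// input sum to the open interval (-1, 1).
function activate($sum)
{
    return tanh($sum);
}

printf("%.4f %.4f %.4f\n", activate(-2.0), activate(0.0), activate(2.0));
// -0.9640 0.0000 0.9640
```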
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
* [http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
93b4c331dace79bdfda2b6840494827a37d51423
370
369
2011-06-01T14:46:41Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers the current stable versions of the ANN implementation for PHP 5.x for download. See the [[Installation]] section for requirements and for instructions on integrating these PHP libraries into your project.
== Version '''2.2.1''' (2011-06-15) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.3 or above)</span> ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann221.zip Download - ann221.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann221.tar.gz Download - ann221.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann221.phar.gz Download (Phar) - ann221.phar.gz] (21 KB)
'''MD5 fingerprints'''
* 7f5327d35d7b6f0dec2cee5065ea4a70 ann221.phar.gz
* 4c7a2ec74b1c9c14bd1d1fad8d71d62d ann221.tar.gz
* e6c90eb4422678306480be2cf2fca7b6 ann221.zip
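After downloading, an archive can be checked against the fingerprints above using PHP's built-in md5_file(). The helper function and file names below are illustrative, not part of the library:

```php
<?php
// Compare a file's MD5 digest against a published fingerprint.
function verifyMd5($strExpected, $strFilename)
{
    return md5_file($strFilename) === $strExpected ? 'OK' : 'MISMATCH';
}

// Self-contained demonstration with a locally created file:
file_put_contents('/tmp/demo.txt', 'hello');
print verifyMd5('5d41402abc4b2a76b9719d911017c592', '/tmp/demo.txt'); // OK
```

For a real download this would be called as, e.g., verifyMd5('e6c90eb4422678306480be2cf2fca7b6', 'ann221.zip').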
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann221.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
== Version '''2.2.0''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann220.zip Download - ann220.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann220.tar.gz Download - ann220.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann220.phar.gz Download (Phar) - ann220.phar.gz] (21 KB)
'''MD5 fingerprints'''
* d25f4b817539cdcb404b93c61fe76f47 ann220.phar.gz
* 6599d8bbbdeae02b0ffbbf5d7cb3b426 ann220.tar.gz
* a79e15ecd9a81038982affd4f8a1cc51 ann220.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann220.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is now protected
* Bug fix: Wrong output type detection in some circumstances
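As of 2.2.0 the classes are addressed through the ANN namespace instead of the old ANN_ prefix. A minimal, self-contained sketch of the naming change (the stub class below only stands in for the real \ANN\Network, which a project would load via ANN/Loader.php):

```php
<?php
// Illustrative stub standing in for the real \ANN\Network class.
namespace ANN;

class Network
{
    public static function name()
    {
        return __CLASS__; // fully qualified class name
    }
}

// Client code imports the namespaced class instead of referring to ANN_Network.
namespace Client;

use ANN\Network;

print Network::name(); // ANN\Network
```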
== Version '''2.1.7''' (2011-06-15) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.2)</span> ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann217.zip Download - ann217.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann217.tar.gz Download - ann217.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann217.phar.gz Download (Phar) - ann217.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 1bdbe4a8dcafb50c7e95289807b77b71 ann217.phar.gz
* 1ce821177150650f4e0675f02c33ad05 ann217.tar.gz
* d88ebee3d476bf488dec448d38139688 ann217.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann217.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
== Version '''2.1.6''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann216.zip Download - ann216.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann216.tar.gz Download - ann216.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann216.phar.gz Download (Phar) - ann216.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 4acbdb23bed762438a0c06c4f23f6a38 ann216.phar.gz
* 7cb3a1b1e17e1272dc02fa7fcbc7d964 ann216.tar.gz
* 9168c4f62693225d9a2785d7a72524de ann216.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann216.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in some circumstances
== Version '''2.1.5''' (2011-05-24) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann215.zip Download - ann215.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann215.tar.gz Download - ann215.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann215.phar.gz Download (Phar) - ann215.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 96ad7c92b08a44d89fd4a7a17b463a9e ann215.phar.gz
* 88ffad519058881f90eb24b4363d0d94 ann215.tar.gz
* 2d293c4f92d0d17f3e0e5ba8e79e3dbd ann215.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann215.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Splitting ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
== Version '''2.1.4''' (2011-05-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann214.zip Download - ann214.zip] (36 KB)
* [http://ann.thwien.de/downloads/ann214.tar.gz Download - ann214.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann214.phar.gz Download (Phar) - ann214.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 998d377de058b959c5ad83141b168e5e ann214.phar.gz
* 12b7a028021477555613d533410526e0 ann214.tar.gz
* e6762c5f667e1710ac7efdca70cb7c41 ann214.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann214.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Better calculation of the remaining running time of the network
* Fixing bug: random delta was not generated correctly
* Adding momentum value
* Simplified ANN_Neuron::adjustWeights()
* Remove possible wrong calculation in ANN_Neuron::adjustWeights() on linear networks
* Simplify ANN_Layer::calculateHiddenDeltas()
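The momentum value added in this release influences how weights are adjusted during backpropagation. A generic sketch of such a weight update (an illustration of the concept only; the function, variable names and the 0.7/0.9 defaults are assumptions, not the library's internal code):

```php
<?php
// Generic backpropagation weight update with learning rate and momentum.
// $floatGradient is the error gradient for this weight; $floatPreviousDelta
// is the delta applied in the previous training step.
function updateWeight($floatWeight, $floatGradient, $floatPreviousDelta,
    $floatLearningRate = 0.7, $floatMomentum = 0.9)
{
    // Momentum carries a fraction of the previous step into the new one,
    // smoothing the trajectory and speeding up convergence on flat regions.
    $floatDelta = $floatLearningRate * $floatGradient
                + $floatMomentum * $floatPreviousDelta;

    return array($floatWeight + $floatDelta, $floatDelta);
}

list($floatWeight, $floatDelta) = updateWeight(0.5, 0.1, 0.0);
print $floatWeight; // 0.57
```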
== Version '''2.1.3''' (2010-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann213.zip Download - ann213.zip] (34 KB)
* [http://ann.thwien.de/downloads/ann213.tar.gz Download - ann213.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann213.phar.gz Download (Phar) - ann213.phar.gz] (20 KB)
'''MD5 fingerprints'''
* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann213.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of a date input support class
== Version '''2.1.2''' (2009-12-26) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann212.zip Download - ann212.zip] (31 KB)
* [http://ann.thwien.de/downloads/ann212.tar.gz Download - ann212.tar.gz] (16 KB)
* [http://ann.thwien.de/downloads/ann212.phar.gz Download (Phar) - ann212.phar.gz] (18 KB)
'''MD5 fingerprints'''
* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann212_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of classification with ANN_Classification (for an example, see [[Detection of language with classification]])
* Phar support (as of PHP 5.3.0)
== Version '''2.1.1''' (2009-12-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [http://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of string association with ANN_StringValue (for an example, see [[Detection of language]])
== Version '''2.1.0''' (2009-12-22) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [http://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate set to 0.7
* Code changes based on profiling
* Changed formatting of printed network details
* Removed trailing PHP end tag
* Code standard improvements
* Removed momentum, precision and unused math methods
* Removed unused methods
* Removed error weight derivative
* Removed weight decay
* Removed dynamic learning rate
* Removed algorithm switches and experimental algorithms; only the standard back propagation algorithm is used
* Renamed class files
* Consistent use of learning rate and delta values
* Rounding of the network error value
== Version '''2.0.7''' (2009-01-01) '''''<span style="color: #2E8B57">stable</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
== Version '''2.0.6''' (2008-12-18) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details now includes the differences of outputs from their desired values
* Completely rewritten code standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance reasons
* Loading class for all ANN classes (SPL autoload)
* Renaming the file name of the ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduced overfitting (no training if an input pattern already produces the desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes into several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Tangens hyperbolicus transfer function
* Several performance improvements
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
* [http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
<big>'''Selling Icecreams'''</big>
== FAQ ==
For information about the dat files, see the [[FAQ]] page.
== Training ==
<source lang="php">
require_once 'ANN/Loader.php';
use ANN\Network;
use ANN\InputValue;
use ANN\OutputValue;
use ANN\Values;
try
{
$objNetwork = Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new Network(2, 5, 1);
$objTemperature = new InputValue(-15, 50); // Temperature in Celsius
$objTemperature->saveToFile('input_temperature.dat');
$objHumidity = new InputValue(0, 100); // Humidity percentage
$objHumidity->saveToFile('input_humidity.dat');
$objIcecream = new OutputValue(0, 300); // Quantity of sold ice-creams
$objIcecream->saveToFile('output_quantity.dat');
$objValues = new Values;
$objValues->train()
->input(
$objTemperature->getInputValue(20),
$objHumidity->getInputValue(10)
)
->output(
$objIcecream->getOutputValue(20)
)
->input(
$objTemperature->getInputValue(30),
$objHumidity->getInputValue(40)
)
->output(
$objIcecream->getOutputValue(90)
)
->input(
$objTemperature->getInputValue(32),
$objHumidity->getInputValue(30)
)
->output(
$objIcecream->getOutputValue(70)
)
->input(
$objTemperature->getInputValue(33),
$objHumidity->getInputValue(20)
)
->output(
$objIcecream->getOutputValue(75)
);
$objValues->saveToFile('values_icecreams.dat');
unset($objValues);
unset($objTemperature);
unset($objHumidity);
unset($objIcecream);
}
try
{
$objTemperature = InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$boolTrained = $objNetwork->train();
print ($boolTrained)
? 'Network trained'
: 'Network not trained completely. Please re-run the script';
$objNetwork->saveToFile('icecreams.dat');
$objNetwork->printNetwork();
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/Loader.php';
use ANN\Network;
use ANN\InputValue;
use ANN\OutputValue;
use ANN\Values;
try
{
$objNetwork = Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
die('Network not found');
}
try
{
$objTemperature = InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = Values::loadFromFile('values_icecreams.dat');
}
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objValues->input( // append further input values to those already loaded
$objTemperature->getInputValue(17),
$objHumidity->getInputValue(12)
)
->input(
$objTemperature->getInputValue(31),
$objHumidity->getInputValue(42)
)
->input(
$objTemperature->getInputValue(31),
$objHumidity->getInputValue(34)
)
->input(
$objTemperature->getInputValue(34),
$objHumidity->getInputValue(21)
);
$objNetwork->setValues($objValues);
$arrOutputs = $objNetwork->getOutputs();
foreach($arrOutputs as $arrOutput)
foreach($arrOutput as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput). '<br />';
</source>
f165c6fcf74d9ce4925c391ec93aa0f63aff879f
352
351
2011-06-01T11:33:56Z
Thwien
2
/* Using trained network */
wikitext
text/x-wiki
== FAQ ==
For information about the dat-files, see the [[FAQ]] page.
== Training ==
<source lang="php">
require_once 'ANN/Loader.php';
use ANN\Network;
use ANN\InputValue;
use ANN\OutputValue;
use ANN\Values;
try
{
$objNetwork = Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new Network(2, 5, 1);
$objTemperature = new InputValue(-15, 50); // Temperature in Celsius
$objTemperature->saveToFile('input_temperature.dat');
$objHumidity = new InputValue(0, 100); // Humidity percentage
$objHumidity->saveToFile('input_humidity.dat');
$objIcecream = new OutputValue(0, 300); // Quantity of sold ice-creams
$objIcecream->saveToFile('output_quantity.dat');
$objValues = new Values;
$objValues->train()
->input(
$objTemperature->getInputValue(20),
$objHumidity->getInputValue(10)
)
->output(
$objIcecream->getOutputValue(20)
)
->input(
$objTemperature->getInputValue(30),
$objHumidity->getInputValue(40)
)
->output(
$objIcecream->getOutputValue(90)
)
->input(
$objTemperature->getInputValue(32),
$objHumidity->getInputValue(30)
)
->output(
$objIcecream->getOutputValue(70)
)
->input(
$objTemperature->getInputValue(33),
$objHumidity->getInputValue(20)
)
->output(
$objIcecream->getOutputValue(75)
);
$objValues->saveToFile('values_icecreams.dat');
unset($objValues);
unset($objTemperature);
unset($objHumidity);
unset($objIcecream);
}
try
{
$objTemperature = InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$boolTrained = $objNetwork->train();
print ($boolTrained)
? 'Network trained'
: 'Network not trained completely. Please re-run the script';
$objNetwork->saveToFile('icecreams.dat');
$objNetwork->printNetwork();
</source>
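The InputValue and OutputValue objects above map real-world ranges such as -15..50 °C onto the network's internal value range. The library's actual implementation is not shown here; the following is a minimal sketch of the min-max scaling that getInputValue() and getRealOutputValue() presumably perform, with hypothetical function names:
<source lang="php">
<?php
// Hypothetical sketch of min-max scaling as InputValue(min, max) and
// OutputValue(min, max) presumably apply it; not the library's actual code.
function scaleToUnit($value, $min, $max)
{
  return ($value - $min) / ($max - $min); // maps [min, max] onto [0, 1]
}

function scaleFromUnit($unit, $min, $max)
{
  return $min + $unit * ($max - $min); // inverse mapping for real outputs
}

// Temperature range -15..50 as in the example above:
print scaleToUnit(20, -15, 50); // about 0.54
</source>
Under this assumption, getRealOutputValue() simply inverts the scaling that getOutputValue() applied during training.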
== Using trained network ==
<source lang="php">
require_once 'ANN/Loader.php';
use ANN\Network;
use ANN\InputValue;
use ANN\OutputValue;
use ANN\Values;
try
{
$objNetwork = Network::loadFromFile('icecreams.dat');
}
catch(Exception $e)
{
die('Network not found');
}
try
{
$objTemperature = InputValue::loadFromFile('input_temperature.dat'); // Temperature in Celsius
$objHumidity = InputValue::loadFromFile('input_humidity.dat'); // Humidity percentage
$objIcecream = OutputValue::loadFromFile('output_quantity.dat'); // Quantity of sold ice-creams
}
catch(Exception $e)
{
die('Error loading value objects');
}
try
{
$objValues = Values::loadFromFile('values_icecreams.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objValues->input( // append further input values to those already loaded
$objTemperature->getInputValue(17),
$objHumidity->getInputValue(12)
)
->input(
$objTemperature->getInputValue(31),
$objHumidity->getInputValue(42)
)
->input(
$objTemperature->getInputValue(31),
$objHumidity->getInputValue(34)
)
->input(
$objTemperature->getInputValue(34),
$objHumidity->getInputValue(21)
);
$objNetwork->setValues($objValues);
$arrOutputs = $objNetwork->getOutputs();
foreach($arrOutputs as $arrOutput)
foreach($arrOutput as $floatOutput)
print $objIcecream->getRealOutputValue($floatOutput). '<br />';
</source>
3b4b61c0106188d10f17fb89faa4c3d4b860e55a
Logical XOR function
0
8
349
265
2011-06-01T11:28:44Z
Thwien
2
/* Training */
wikitext
text/x-wiki
== FAQ ==
For information about the dat-files, see the [[FAQ]] page.
== Training ==
<source lang="php">
require_once 'ANN/Loader.php';
use ANN\Network;
use ANN\Values;
try
{
$objNetwork = Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new Network;
$objValues = new Values;
$objValues->train()
->input(0,0)->output(0)
->input(0,1)->output(1)
->input(1,0)->output(1)
->input(1,1)->output(0);
$objValues->saveToFile('values_xor.dat');
unset($objValues);
}
try
{
$objValues = Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$boolTrained = $objNetwork->train();
print ($boolTrained)
? 'Network trained'
: 'Network not trained completely. Please re-run the script';
$objNetwork->saveToFile('xor.dat');
$objNetwork->printNetwork();
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
die('Network not found');
}
try
{
$objValues = ANN_Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objValues->input(0, 1) // append further input values to those already loaded
->input(1, 1)
->input(1, 0)
->input(0, 0)
->input(0, 1)
->input(1, 1);
$objNetwork->setValues($objValues);
print_r($objNetwork->getOutputs());
</source>
61594b32d4aa27c86ced2f3676779d09cd57f2bf
350
349
2011-06-01T11:29:33Z
Thwien
2
/* Using trained network */
wikitext
text/x-wiki
== FAQ ==
For information about the dat-files, see the [[FAQ]] page.
== Training ==
<source lang="php">
require_once 'ANN/Loader.php';
use ANN\Network;
use ANN\Values;
try
{
$objNetwork = Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new Network;
$objValues = new Values;
$objValues->train()
->input(0,0)->output(0)
->input(0,1)->output(1)
->input(1,0)->output(1)
->input(1,1)->output(0);
$objValues->saveToFile('values_xor.dat');
unset($objValues);
}
try
{
$objValues = Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$boolTrained = $objNetwork->train();
print ($boolTrained)
? 'Network trained'
: 'Network not trained completely. Please re-run the script';
$objNetwork->saveToFile('xor.dat');
$objNetwork->printNetwork();
</source>
== Using trained network ==
<source lang="php">
require_once 'ANN/Loader.php';
use ANN\Network;
use ANN\Values;
try
{
$objNetwork = Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
die('Network not found');
}
try
{
$objValues = Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objValues->input(0, 1) // append further input values to those already loaded
->input(1, 1)
->input(1, 0)
->input(0, 0)
->input(0, 1)
->input(1, 1);
$objNetwork->setValues($objValues);
print_r($objNetwork->getOutputs());
</source>
1e57c5ddf1141961bf1b16ceabc6fb2c97db6afc
Detection of language
0
20
353
279
2011-06-01T11:35:35Z
Thwien
2
/* Training */
wikitext
text/x-wiki
== FAQ ==
For information about the dat-files, see the [[FAQ]] page.
== Training ==
<source lang="php">
require_once '../ANN/Loader.php';
use ANN\Network;
use ANN\StringValue;
use ANN\Values;
try
{
$objNetwork = Network::loadFromFile('strings.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new Network(1, 8, 2);
$objStringValues = new StringValue(15); // As of ANN 2.1.1
$objStringValues->saveToFile('input_strings.dat');
$objValues = new Values;
$objValues->train()
->input($objStringValues->getInputValue('Hallo Welt'))
->output(1, 0) // German
->input($objStringValues->getInputValue('Hello World'))
->output(0, 1); // English
$objValues->saveToFile('values_strings.dat');
unset($objValues);
}
try
{
$objValues = Values::loadFromFile('values_strings.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues);
$objNetwork->train();
$objNetwork->saveToFile('strings.dat');
$objNetwork->printNetwork();
</source>
== Using trained network ==
<source lang="php">
require_once '../ANN/Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('strings.dat');
}
catch(Exception $e)
{
print 'Network cannot be loaded';
}
try
{
$objValues = ANN_Values::loadFromFile('values_strings.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
try
{
$objStringValues = ANN_StringValue::loadFromFile('input_strings.dat');
}
catch(Exception $e)
{
die('Loading of input values failed');
}
$objValues->input($objStringValues->getInputValue('HAllo Welt'));
$objValues->input($objStringValues->getInputValue('Hello World'));
$objValues->input($objStringValues->getInputValue('Hálló Wélt'));
$objValues->input($objStringValues->getInputValue('Hélló Wórld'));
$objValues->input($objStringValues('Hßllo Welt')); // As of PHP 5.3.0
$objValues->input($objStringValues('Hßlló Wórld')); // As of PHP 5.3.0
$objValues->input($objStringValues('Hallo Welt!')); // As of PHP 5.3.0
$objValues->input($objStringValues('Helló Wórld!')); // As of PHP 5.3.0
$objNetwork->setValues($objValues);
$objNetwork->printNetwork();
</source>
da2f367d9b430e75f6193081d0011cc47c3465e6
354
353
2011-06-01T11:39:32Z
Thwien
2
/* Using trained network */
wikitext
text/x-wiki
== FAQ ==
For information about the dat-files, see the [[FAQ]] page.
== Training ==
<source lang="php">
require_once '../ANN/Loader.php';
use ANN\Network;
use ANN\StringValue;
use ANN\Values;
try
{
$objNetwork = Network::loadFromFile('strings.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new Network(1, 8, 2);
$objStringValues = new StringValue(15); // As of ANN 2.1.1
$objStringValues->saveToFile('input_strings.dat');
$objValues = new Values;
$objValues->train()
->input($objStringValues->getInputValue('Hallo Welt'))
->output(1, 0) // German
->input($objStringValues->getInputValue('Hello World'))
->output(0, 1); // English
$objValues->saveToFile('values_strings.dat');
unset($objValues);
}
try
{
$objValues = Values::loadFromFile('values_strings.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues);
$objNetwork->train();
$objNetwork->saveToFile('strings.dat');
$objNetwork->printNetwork();
</source>
== Using trained network ==
<source lang="php">
require_once '../ANN/Loader.php';
use ANN\Network;
use ANN\StringValue;
use ANN\Values;
try
{
$objNetwork = Network::loadFromFile('strings.dat');
}
catch(Exception $e)
{
print 'Network cannot be loaded';
}
try
{
$objValues = Values::loadFromFile('values_strings.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
try
{
$objStringValues = StringValue::loadFromFile('input_strings.dat');
}
catch(Exception $e)
{
die('Loading of input values failed');
}
$objValues->input($objStringValues->getInputValue('HAllo Welt'));
$objValues->input($objStringValues->getInputValue('Hello World'));
$objValues->input($objStringValues->getInputValue('Hálló Wélt'));
$objValues->input($objStringValues->getInputValue('Hélló Wórld'));
$objValues->input($objStringValues('Hßllo Welt')); // As of PHP 5.3.0
$objValues->input($objStringValues('Hßlló Wórld')); // As of PHP 5.3.0
$objValues->input($objStringValues('Hallo Welt!')); // As of PHP 5.3.0
$objValues->input($objStringValues('Helló Wórld!')); // As of PHP 5.3.0
$objNetwork->setValues($objValues);
$objNetwork->printNetwork();
</source>
e066c56cd58914fc27710fa8537ea1265e9a009c
Detection of language with classification
0
21
355
285
2011-06-01T12:10:16Z
Thwien
2
/* Training */
wikitext
text/x-wiki
== FAQ ==
For information about the dat-files, see the [[FAQ]] page.
== Training ==
<source lang="php">
require_once '../ANN/Loader.php';
use ANN\Network;
use ANN\Classification;
use ANN\StringValue;
use ANN\Values;
try
{
$objNetwork = Network::loadFromFile('strings.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objClassification = new Classification(2); // As of ANN 2.1.2
$objClassification->addClassifier('german');
$objClassification->addClassifier('english');
$objClassification->saveToFile('classifiers_strings.dat');
$objNetwork = new Network(1, 8, 2);
$objStringValues = new StringValue(15); // As of ANN 2.1.1
$objStringValues->saveToFile('input_strings.dat');
$objValues = new Values;
$objValues->train()
->input($objStringValues->getInputValue('Hallo Welt'))
->output($objClassification->getOutputValue('german'))
->input($objStringValues->getInputValue('Hello World'))
->output($objClassification('english')); // As of PHP 5.3.0
$objValues->saveToFile('values_strings.dat');
unset($objValues);
}
try
{
$objValues = Values::loadFromFile('values_strings.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues);
$objNetwork->train();
$objNetwork->saveToFile('strings.dat');
$objNetwork->printNetwork();
</source>
== Using trained network ==
<source lang="php">
require_once '../ANN/Loader.php';
try
{
$objNetwork = ANN_Network::loadFromFile('strings.dat');
}
catch(Exception $e)
{
print 'Network cannot be loaded';
}
try
{
$objValues = ANN_Values::loadFromFile('values_strings.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
try
{
$objStringValues = ANN_StringValue::loadFromFile('input_strings.dat');
}
catch(Exception $e)
{
die('Loading of input values failed');
}
try
{
$objClassification = ANN_Classification::loadFromFile('classifiers_strings.dat');
}
catch(Exception $e)
{
die('Loading of classification failed');
}
$objValues->input($objStringValues->getInputValue('HAllo Welt'));
$objValues->input($objStringValues->getInputValue('Hello World'));
$objValues->input($objStringValues->getInputValue('Hálló Wélt'));
$objValues->input($objStringValues->getInputValue('Hélló Wórld'));
$objValues->input($objStringValues('Hßllo Welt')); // As of PHP 5.3.0
$objValues->input($objStringValues('Hßlló Wórld')); // As of PHP 5.3.0
$objValues->input($objStringValues('Hallo Welt!')); // As of PHP 5.3.0
$objValues->input($objStringValues('Helló Wórld!')); // As of PHP 5.3.0
$objNetwork->setValues($objValues);
$objNetwork->printNetwork();
$arrOutputs = $objNetwork->getOutputs();
foreach($arrOutputs as $arrOutput)
print_r($objClassification->getRealOutputValue($arrOutput));
</source>
3b2df7f8b1379b992c2e21a112e2aa41c4aa538e
356
355
2011-06-01T12:11:57Z
Thwien
2
/* Using trained network */
wikitext
text/x-wiki
== FAQ ==
For information about the dat-files, see the [[FAQ]] page.
== Training ==
<source lang="php">
require_once '../ANN/Loader.php';
use ANN\Network;
use ANN\Classification;
use ANN\StringValue;
use ANN\Values;
try
{
$objNetwork = Network::loadFromFile('strings.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objClassification = new Classification(2); // As of ANN 2.1.2
$objClassification->addClassifier('german');
$objClassification->addClassifier('english');
$objClassification->saveToFile('classifiers_strings.dat');
$objNetwork = new Network(1, 8, 2);
$objStringValues = new StringValue(15); // As of ANN 2.1.1
$objStringValues->saveToFile('input_strings.dat');
$objValues = new Values;
$objValues->train()
->input($objStringValues->getInputValue('Hallo Welt'))
->output($objClassification->getOutputValue('german'))
->input($objStringValues->getInputValue('Hello World'))
->output($objClassification('english')); // As of PHP 5.3.0
$objValues->saveToFile('values_strings.dat');
unset($objValues);
}
try
{
$objValues = Values::loadFromFile('values_strings.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues);
$objNetwork->train();
$objNetwork->saveToFile('strings.dat');
$objNetwork->printNetwork();
</source>
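The Classification object above presumably turns each registered classifier into one component of a binary output vector, so that 'german' and 'english' map to the two network outputs. A minimal, hypothetical sketch of such a one-hot encoding (not the library's actual code):
<source lang="php">
<?php
// Hypothetical one-hot encoding as Classification with the classifiers
// 'german' and 'english' presumably uses it; not the library's actual code.
function oneHot($label, array $labels)
{
  $vector = array();
  foreach ($labels as $known)
    $vector[] = ($known === $label) ? 1 : 0; // 1 only for the matching label
  return $vector;
}

print_r(oneHot('german', array('german', 'english'))); // first component 1, second 0
</source>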
== Using trained network ==
<source lang="php">
require_once '../ANN/Loader.php';
use ANN\Network;
use ANN\Classification;
use ANN\StringValue;
use ANN\Values;
try
{
$objNetwork = Network::loadFromFile('strings.dat');
}
catch(Exception $e)
{
print 'Network cannot be loaded';
}
try
{
$objValues = Values::loadFromFile('values_strings.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
try
{
$objStringValues = StringValue::loadFromFile('input_strings.dat');
}
catch(Exception $e)
{
die('Loading of input values failed');
}
try
{
$objClassification = Classification::loadFromFile('classifiers_strings.dat');
}
catch(Exception $e)
{
die('Loading of classification failed');
}
$objValues->input($objStringValues->getInputValue('HAllo Welt'));
$objValues->input($objStringValues->getInputValue('Hello World'));
$objValues->input($objStringValues->getInputValue('Hálló Wélt'));
$objValues->input($objStringValues->getInputValue('Hélló Wórld'));
$objValues->input($objStringValues('Hßllo Welt')); // As of PHP 5.3.0
$objValues->input($objStringValues('Hßlló Wórld')); // As of PHP 5.3.0
$objValues->input($objStringValues('Hallo Welt!')); // As of PHP 5.3.0
$objValues->input($objStringValues('Helló Wórld!')); // As of PHP 5.3.0
$objNetwork->setValues($objValues);
$objNetwork->printNetwork();
$arrOutputs = $objNetwork->getOutputs();
foreach($arrOutputs as $arrOutput)
print_r($objClassification->getRealOutputValue($arrOutput));
</source>
2f1e2746fe9edaa5fbf8f184825ed585bb482baa
Visual network topology
0
10
357
268
2011-06-01T12:12:47Z
Thwien
2
/* PNG image of network topology */
wikitext
text/x-wiki
== PNG image of network topology ==
<source lang="php">
require_once 'ANN/Loader.php';
use ANN\Network;
use ANN\NetworkGraph;
try
{
$objNetwork = Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
die('Network not found');
}
$objNetworkImage = new NetworkGraph($objNetwork);
$objNetworkImage->saveToFile('network.png');
</source>
== Output ==
[[Image:network.png|800px|Image of network topology]]
2bb767e60d4aa891dffc75fffd9d8f1e631a3e2f
Logging network weights
0
12
358
269
2011-06-01T12:13:55Z
Thwien
2
/* Logging network weights while training */
wikitext
text/x-wiki
== Logging network weights while training ==
<source lang="php">
require_once 'ANN/Loader.php';
use ANN\Network;
use ANN\Values;
try
{
$objNetwork = Network::loadFromFile('xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new Network;
$objValues = new Values;
$objValues->train()
->input(0,0)->output(0)
->input(0,1)->output(1)
->input(1,0)->output(1)
->input(1,1)->output(0);
$objValues->saveToFile('values_xor.dat');
unset($objValues);
}
try
{
$objValues = Values::loadFromFile('values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$objNetwork->logWeightsToFile('network.csv'); // Start logging
$objNetwork->train();
$objNetwork->saveToFile('xor.dat');
</source>
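The resulting CSV log can be inspected with standard PHP functions. The exact column layout of network.csv is not documented here, so the following sketch simply prints each row as-is:
<source lang="php">
<?php
// Print each row of the CSV weight log written by logWeightsToFile().
// The column layout of network.csv is an assumption of this sketch.
function printWeightLog($file)
{
  if (!is_readable($file))
  {
    print 'Log file not found';
    return;
  }
  $handle = fopen($file, 'r');
  while (($row = fgetcsv($handle)) !== false)
    print implode(' | ', $row) . "<br />";
  fclose($handle);
}

printWeightLog('network.csv');
</source>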
2b63e5b0af68cd3a8b98685065efcc8702b47f7d
Copyright
0
4
361
272
2011-06-01T12:19:11Z
Thwien
2
wikitext
text/x-wiki
The copyright conditions are included in the source files.
<code>
* Artificial Neural Network - Version 2.2
*
* For updates and changes visit the project page at http://ann.thwien.de/
*
*
*
* <b>LICENCE</b>
*
* The BSD 2-Clause License
*
* http://opensource.org/licenses/bsd-license.php
*
* Copyright (c) 2002, Eddy Young
* Copyright (c) 2007 - 2011, Thomas Wien
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
*
* 1. Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
*
* 2. Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in the
* documentation and/or other materials provided with the distribution.
*
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
* LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
* FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
* COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
* INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
* BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
* LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
* LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN
* ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
* POSSIBILITY OF SUCH DAMAGE.
*
* @author Eddy Young <jeyoung_at_priscimon_dot_com>
* @author Thomas Wien <info_at_thwien_dot_de>
* @version ANN Version 1.0 by Eddy Young
* @version ANN Version 2.2 by Thomas Wien
* @copyright Copyright (c) 2002 by Eddy Young
* @copyright Copyright (c) 2007-2011 by Thomas Wien
* @package ANN
</code>
a902b2ddd9ac8fed56d370872c403ceaa12d5040
FAQ
0
19
362
271
2011-06-01T12:24:35Z
Thwien
2
/* What are the dat-files? */
wikitext
text/x-wiki
== What are the dat-files? ==
A dat-file is an auto-generated file. Its contents are a serialized structure of the saved object. It is generated while training the neural network, using '''\ANN\InputValue::saveToFile()''' or '''\ANN\Network::saveToFile()''', and can be reloaded into a running script with '''\ANN\InputValue::loadFromFile()''' or '''\ANN\Network::loadFromFile()'''.
The dat-files are not included in the example code downloads because they are auto-generated.
To save the dat-files, the directory you are saving them to must be writable by the running PHP process.
For example, if your PHP script runs as an Apache PHP module and Apache runs as user ''www-data'', you can use the following commands to set the permissions.
Change to the directory where your own ANN script is stored.
>cd <PROJECTDIR-OF-YOUR-ANN>
Create a subdirectory for all your dat-files.
>mkdir dats
Change the group owner of this subdirectory to ''www-data''.
>chgrp www-data dats
Change the UNIX permissions to the group for write access to this subdirectory.
>chmod g+w dats
For security, remove all ''others'' permissions in this case.
>chmod o-rwx dats
List your permissions.
>ls -la dats
drwxrwx--- 7 user www-data 4.0K 2009-10-28 08:52 dats
Use this subdirectory in your PHP scripts:
<source lang="php">
require_once 'ANN/Loader.php';
use ANN\Network;
use ANN\Values;
try
{
$objNetwork = Network::loadFromFile('dats/xor.dat');
}
catch(Exception $e)
{
print 'Creating a new one...';
$objNetwork = new Network;
$objValues = new Values;
$objValues->train()
->input(0,0)->output(0)
->input(0,1)->output(1)
->input(1,0)->output(1)
->input(1,1)->output(0);
$objValues->saveToFile('dats/values_xor.dat');
unset($objValues);
}
try
{
$objValues = Values::loadFromFile('dats/values_xor.dat');
}
catch(Exception $e)
{
die('Loading of values failed');
}
$objNetwork->setValues($objValues); // to be called as of version 2.0.6
$boolTrained = $objNetwork->train();
print ($boolTrained)
? 'Network trained'
: 'Network not trained completely. Please re-run the script';
$objNetwork->saveToFile('dats/xor.dat');
$objNetwork->printNetwork();
</source>
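Before training, a script can check up front that the subdirectory is actually writable by the PHP process, instead of failing later inside saveToFile(). A minimal sketch (the helper name isWritableDir() is hypothetical, not part of the library):
<source lang="php">
<?php
// Check that the dats/ subdirectory exists and is writable by the
// PHP process before the network tries to save its dat-files there.
function isWritableDir($dir)
{
  return is_dir($dir) && is_writable($dir);
}

if (isWritableDir('dats'))
  print 'dats is writable';
else
  print 'dats is missing or not writable by the PHP process';
</source>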
971951f02cbdd61ad24fc8328ee29c3f75110e78
MediaWiki:Sidebar
8
49
372
2011-06-06T19:47:48Z
Administrator
1
Created page with "* navigation ** mainpage|mainpage-description ** Neural_Networks|Neural Networks ** Multilayer_perceptron|Multilayer perceptron ** Download|Download ** Installation|Installation ..."
wikitext
text/x-wiki
* navigation
** mainpage|mainpage-description
** Neural_Networks|Neural Networks
** Multilayer_perceptron|Multilayer perceptron
** Download|Download
** Installation|Installation
** Examples|Examples
** FAQ|FAQ
<!-- ** portal-url|portal -->
<!-- ** currentevents-url|currentevents -->
** recentchanges-url|recentchanges
** randompage-url|randompage
** helppage|help
* SEARCH
* TOOLBOX
* LANGUAGES
56ce889f4840913bb4e52a980e76d323697e6cc2
Main Page
0
1
374
371
2011-06-15T07:08:30Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002. Several improvements and changes to this implementation have been made by ''Thomas Wien'' ([http://thwien.de thwien.de] - Düsseldorf - Germany) since 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. For a short idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Donate ==
If you want to support the current and future development of this project, I would appreciate a donation of any amount via PayPal.
[http://ann.thwien.de/donate/donate.html Donate now]
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Development]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* String association
* Classification
* Phar support (as of PHP 5.3.0)
== Versions and Change-Log ==
'''Version 2.x.x by Thomas Wien''' (Development)
* Image matrix support
'''Version 2.2.1 by Thomas Wien''' (2011-06-15) [[Download]]
* Bug fix: Wrong output type detection in binary networks
'''Version 2.2.0 by Thomas Wien''' (2011-06-01)
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is protected now
* Bug fix: Wrong output type detection in some circumstances
'''Version 2.1.7 by Thomas Wien''' (2011-06-15) [[Download]]
* Bug fix: Wrong output type detection in binary networks
'''Version 2.1.6 by Thomas Wien''' (2011-06-01)
* Bug fix: Wrong output type detection in some circumstances
'''Version 2.1.5 by Thomas Wien''' (2011-05-24)
* Dividing method ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
'''Version 2.1.4 by Thomas Wien''' (2011-05-23)
* Better calculation of remaining time of running the network
* Fixing bug: random delta generation was incorrect
* Adding momentum value
* Simplifying ANN_Neuron::adjustWeights()
* Removing a possibly wrong calculation in ANN_Neuron::adjustWeights() on linear networks
* Simplifying ANN_Layer::calculateHiddenDeltas()
'''Version 2.1.3 by Thomas Wien''' (2010-01-06)
* Date input support class
'''Version 2.1.2 by Thomas Wien''' (2009-12-26)
* Classification support
* Phar support (as of PHP 5.3.0)
'''Version 2.1.1 by Thomas Wien''' (2009-12-23)
* String association support
'''Version 2.1.0 by Thomas Wien''' (2009-12-22)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
* Switching to Git version control
* Moving all experimental code into branch
* Removing all experimental code from master branch (due to performance and future development)
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Improving license agreement of source code
* Adding error codes to exceptions
* Exception if network error does not reach minimum
11073497b04af169c859f0024dac610d90720441
379
374
2011-06-15T07:56:44Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on a work by ''Eddy Young'' from 2002. Several improvements and changes to this implementation have been made by ''Thomas Wien'' ([http://thwien.de thwien.de] - Düsseldorf - Germany) since 2007. You will find the PHP source in the [[Download]] section. Please consider the [[Copyright]]. For a short idea of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Donate ==
If you want to support the current and future development
of this project I would appreciate if you donate a freely amount
via paypal.
[http://ann.thwien.de/donate/donate.html Donate now]
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Development]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* String association
* Classification
* Phar support (as of PHP 5.3.0)
== Versions and Change-Log ==
'''Version 2.x.x by Thomas Wien''' (Development)
* Image matrix support
'''Version 2.2.1 by Thomas Wien''' (2011-06-15) [[Download]]
* Bug fix: Wrong output type detection in binary networks
* Using Interface called \ANN\InterfaceLoadable to make easier decision of loadable objects.
'''Version 2.2.0 by Thomas Wien''' (2011-06-01)
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is protected now
* Bug fix: Wrong output type detection in some circumstances
'''Version 2.1.7 by Thomas Wien''' (2011-06-15) [[Download]]
* Bug fix: Wrong output type detection in binary networks
'''Version 2.1.6 by Thomas Wien''' (2011-06-01)
* Bug fix: Wrong output type detection in some circumstances
'''Version 2.1.5 by Thomas Wien''' (2011-05-24)
* Dividing method ANN_Math::random() in ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details including __invoke() und __toString() converting
'''Version 2.1.4 by Thomas Wien''' (2011-05-23)
* Better calculation of remaining time of running the network
* Fixing bug: generating random delta not correct
* Adding momentum value
* Simplified ANN_Neuron::adjustWeights()
* Remove possible wrong calculation in ANN_Neuron::adjustWeights() on linear networks
* Simplify ANN_Layer::calculateHiddenDeltas()
'''Version 2.1.3 by Thomas Wien''' (2010-01-06)
* Date input support class
'''Version 2.1.2 by Thomas Wien''' (2009-12-26)
* Classification support
* Phar support (as of PHP 5.3.0)
'''Version 2.1.1 by Thomas Wien''' (2009-12-23)
* String association support
'''Version 2.1.0 by Thomas Wien''' (2009-12-22)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
* Switching to Git version control
* Moving all experimental code into branch
* Removing all experimental code from master branch (due to performance and future development)
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* More Examples
* Performance check depending on host system
* Wiki: More details to installation and use
* PHPDoc: More details to documentation
* Improving license agreement of source code
* Adding error codes to exceptions
* Exception if network error does not reach minimum
706dd2db32ab1a629be0e404224edf60829e1b88
384
379
2011-07-04T08:25:47Z
Thwien
2
/* Versions and Change-Log */
wikitext
text/x-wiki
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This project implements a '''neural network topology called ''[[multilayer perceptron]]'' for PHP 5.x''' environments. The source code is based on work by ''Eddy Young'' from 2002. Numerous improvements and changes to this implementation have been made by ''Thomas Wien'' ([http://thwien.de thwien.de] - Düsseldorf - Germany) since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short introduction to the benefits of neural networks, have a look at the [[Neural Networks]] page.
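A training session with this library looks roughly like the following sketch. Note this is illustrative only: the class names <code>\ANN\Network</code> and <code>\ANN\Values</code> and the methods <code>input()</code>/<code>output()</code> appear in the change-log below, but <code>train()</code>, <code>saveToFile()</code> and the constructor signature are assumptions here — see the [[Examples]] page for the authoritative API.

```php
<?php
// Hedged sketch (assumed API): train a small multilayer perceptron on XOR.
// \ANN\Network and \ANN\Values::input()/output() are named in the change-log;
// the constructor arguments, train() and saveToFile() are illustrative guesses.

use ANN\Network;
use ANN\Values;

$network = new Network(2, 4, 1);   // 2 inputs, 4 hidden neurons, 1 output (assumed signature)

$values = new Values();
$values->input(0, 0)->output(0);   // each input pattern paired with its desired output
$values->input(0, 1)->output(1);
$values->input(1, 0)->output(1);
$values->input(1, 1)->output(0);

$network->train($values);          // assumed training entry point
$network->saveToFile('xor.dat');   // saving/loading the network is a documented feature; name assumed
```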
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Development]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* String association
* Classification
* Phar support (as of PHP 5.3.0)
== Versions and Change-Log ==
'''Version 2.x.x by Thomas Wien''' (Development)
* Image matrix support
'''Version 2.3.0 by Thomas Wien''' (2012-12-13) [[Download]]
* Using traits for performance reasons
* Checking PHP version for compatibility with the ANN library
* Requires PHP 5.4.x or above
'''Version 2.2.3 by Thomas Wien''' (2012-12-12) [[Download]]
* Bug fix: Division by zero if training time is below one second
* Adding PHP version and SAPI interface to network information
* Tested on PHP 5.4
'''Version 2.2.2 by Thomas Wien''' (2011-07-04)
* Redesign of network details
* Considering CPU limits in the calculation of network execution time
'''Version 2.2.1 by Thomas Wien''' (2011-06-15)
* Bug fix: Wrong output type detection in binary networks
* Using an interface called \ANN\InterfaceLoadable to simplify detection of loadable objects
'''Version 2.2.0 by Thomas Wien''' (2011-06-01)
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is protected now
* Bug fix: Wrong output type detection in some circumstances
'''Version 2.1.7 by Thomas Wien''' (2011-06-15) [[Download]]
* Bug fix: Wrong output type detection in binary networks
'''Version 2.1.6 by Thomas Wien''' (2011-06-01)
* Bug fix: Wrong output type detection in some circumstances
'''Version 2.1.5 by Thomas Wien''' (2011-05-24)
* Splitting method ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
'''Version 2.1.4 by Thomas Wien''' (2011-05-23)
* Better calculation of the remaining network run time
* Fixing bug: random delta generation was incorrect
* Adding momentum value
* Simplifying ANN_Neuron::adjustWeights()
* Removing a possibly wrong calculation in ANN_Neuron::adjustWeights() on linear networks
* Simplifying ANN_Layer::calculateHiddenDeltas()
'''Version 2.1.3 by Thomas Wien''' (2010-01-06)
* Date input support class
'''Version 2.1.2 by Thomas Wien''' (2009-12-26)
* Classification support
* Phar support (as of PHP 5.3.0)
'''Version 2.1.1 by Thomas Wien''' (2009-12-23)
* String association support
'''Version 2.1.0 by Thomas Wien''' (2009-12-22)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
* Switching to Git version control
* Moving all experimental code into a branch
* Removing all experimental code from the master branch (due to performance and future development)
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details showing the differences between outputs and their desired values
* Completely rewritten variable coding standard
* New class ANN_Values for defining input and output values
* Adding code examples to the phpDoc documentation
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance reasons
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if an input pattern already produces the desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate was not part of the saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separating the classes into several files
* Version control with Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance improvements
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* More examples
* Performance check depending on the host system
* Wiki: more details on installation and use
* PHPDoc: more detailed documentation
* Improving the license agreement of the source code
* Adding error codes to exceptions
* Throwing an exception if the network error does not reach the minimum
2020-11-27T11:08:32Z
Thwien
__NOTOC__
<big>'''ANN - Artificial Neural Network for PHP'''</big>
This project implements a '''neural network topology known as the ''[[multilayer perceptron]]'' for PHP''' environments. The source code is based on work by ''Eddy Young'' from 2002. Numerous improvements and changes to this implementation have been made by ''Thomas Wien'' ([https://thwien.de thwien.de] - Ratingen - Germany) since 2007. You will find the PHP source in the [[Download]] section. Please observe the [[Copyright]]. For a short overview of the benefits of neural networks, have a look at the [[Neural Networks]] page.
== Overview ==
* [[Neural Networks]]
* [[Multilayer perceptron|Multilayer Perceptron]]
* [[Download]]
* [[Installation]]
* [[Examples]]
* [[Development]]
* [[FAQ]]
* [[Copyright]]
== Features ==
* Output type detection (linear or binary)
* Logging (weights and network errors)
* Client-Server model for distributed applications
* Graphical network topology as PNG image
* Displaying network details
* String association
* Classification
* Phar support (as of PHP 5.3.0)
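To give an idea of the technique behind these features, here is a minimal, self-contained multilayer perceptron sketch in plain PHP: one hidden layer with the hyperbolic tangent transfer function, a linear output neuron, and classic backpropagation deltas. It deliberately does ''not'' use the library's classes; every name in it is made up for illustration only, and the actual API is covered on the [[Examples]] and [[Installation]] pages.

```php
<?php
// Illustrative multilayer perceptron: 2 inputs -> 2 hidden neurons (tanh)
// -> 1 linear output. NOT the library's API; all names are hypothetical.

mt_srand(7); // fixed seed so the run is reproducible

function randomWeight()
{
    return mt_rand(-1000, 1000) / 1000.0;
}

// Each weight row is array(w1, w2, bias).
$hiddenWeights = array(
    array(randomWeight(), randomWeight(), randomWeight()),
    array(randomWeight(), randomWeight(), randomWeight()),
);
$outputWeights = array(randomWeight(), randomWeight(), randomWeight());

function forwardPass($input, $hiddenWeights, $outputWeights)
{
    $hidden = array();
    foreach ($hiddenWeights as $w) {
        // hyperbolic tangent transfer function
        $hidden[] = tanh($w[0] * $input[0] + $w[1] * $input[1] + $w[2]);
    }
    $output = $outputWeights[0] * $hidden[0]
            + $outputWeights[1] * $hidden[1]
            + $outputWeights[2]; // linear output neuron
    return array($hidden, $output);
}

$input        = array(1.0, 0.0);
$desired      = 0.8;
$learningRate = 0.1;

for ($epoch = 0; $epoch < 200; $epoch++) {
    list($hidden, $output) = forwardPass($input, $hiddenWeights, $outputWeights);
    $outputDelta = $desired - $output; // error at the linear output
    for ($i = 0; $i < 2; $i++) {
        // hidden delta: back-propagated error times the tanh derivative
        $hiddenDelta = $outputDelta * $outputWeights[$i]
                     * (1 - $hidden[$i] * $hidden[$i]);
        $hiddenWeights[$i][0] += $learningRate * $hiddenDelta * $input[0];
        $hiddenWeights[$i][1] += $learningRate * $hiddenDelta * $input[1];
        $hiddenWeights[$i][2] += $learningRate * $hiddenDelta; // bias
        $outputWeights[$i]    += $learningRate * $outputDelta * $hidden[$i];
    }
    $outputWeights[2] += $learningRate * $outputDelta; // output bias
}

list($hidden, $output) = forwardPass($input, $hiddenWeights, $outputWeights);
printf("output after training: %.3f (desired %.1f)\n", $output, $desired);
```

Training on a single pattern like this converges quickly; the library adds on top of this core loop the momentum, dynamic learning rate, and output type detection listed above.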
== Versions and Change-Log ==
'''Version 2.x.x by Thomas Wien''' (Development)
* Image matrix support
'''Version 2.3.0 by Thomas Wien''' (2012-12-13) [[Download]]
* Using traits for performance reasons
* Checking the PHP version for compatibility with the ANN library
* PHP 5.4.x or above
'''Version 2.2.3 by Thomas Wien''' (2012-12-12) [[Download]]
* Bug fix: Division by zero if the training time was below one second
* Adding PHP version and SAPI interface to the network information
* Tested on PHP 5.4
'''Version 2.2.2 by Thomas Wien''' (2011-07-04)
* Redesign of network details
* Considering CPU limits when calculating the network execution time
'''Version 2.2.1 by Thomas Wien''' (2011-06-15)
* Bug fix: Wrong output type detection in binary networks
* Using an interface named \ANN\InterfaceLoadable to simplify the detection of loadable objects
'''Version 2.2.0 by Thomas Wien''' (2011-06-01)
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is protected now
* Bug fix: Wrong output type detection in some circumstances
'''Version 2.1.7 by Thomas Wien''' (2011-06-15) [[Download]]
* Bug fix: Wrong output type detection in binary networks
'''Version 2.1.6 by Thomas Wien''' (2011-06-01)
* Bug fix: Wrong output type detection in some circumstances
'''Version 2.1.5 by Thomas Wien''' (2011-05-24)
* Splitting the method ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
'''Version 2.1.4 by Thomas Wien''' (2011-05-23)
* Better calculation of the remaining network run time
* Fixing bug: random delta generation was incorrect
* Adding a momentum value
* Simplifying ANN_Neuron::adjustWeights()
* Removing a possibly wrong calculation in ANN_Neuron::adjustWeights() for linear networks
* Simplifying ANN_Layer::calculateHiddenDeltas()
'''Version 2.1.3 by Thomas Wien''' (2010-01-06)
* Date input support class
'''Version 2.1.2 by Thomas Wien''' (2009-12-26)
* Classification support
* Phar support (as of PHP 5.3.0)
'''Version 2.1.1 by Thomas Wien''' (2009-12-23)
* String association support
'''Version 2.1.0 by Thomas Wien''' (2009-12-22)
* Checking parameter counts on ANN_Values::input() and ANN_Values::output()
* Removing protected method ANN_Neuron::getInputs()
* Fixing bug: Error tolerance calculation in ANN_Network::isTrainingComplete()
* Switching to Git version control
* Moving all experimental code into a branch
* Removing all experimental code from the master branch (due to performance and future development)
'''Version 2.0.7 by Thomas Wien''' (2009-01-01) [[Download]]
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
'''Version 2.0.6 by Thomas Wien''' (2008-12-18)
* Printing network details showing the differences between outputs and their desired values
* Completely rewritten variable coding standard
* New class ANN_Values for defining input and output values
* Adding code examples to the phpDoc documentation
* Internal math precision defaults to 5
'''Version 2.0.5 by Thomas Wien''' (2008-12-16)
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance reasons
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
'''Version 2.0.4 by Thomas Wien''' (2008-01-27)
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if an input pattern already produces the desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
'''Version 2.0.3 by Thomas Wien''' (2008-01-17)
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
* Fixing bug: runtime error on call of setMomentum()
'''Version 2.0.2 by Thomas Wien''' (2008-01-14)
* Client-Server model for distributed applications
* Calculating total network error for csv logging
'''Version 2.0.1 by Thomas Wien''' (2008-01-06)
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
'''Version 2.0.0 by Thomas Wien''' (2007-12-17)
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
'''Version 1.0 by Eddy Young''' (2002)
* Initial version
== Todo ==
* More examples
* Performance check depending on the host system
* Wiki: more details on installation and use
* PHPDoc: more details in the documentation
* Improving the license agreement of the source code
* Adding error codes to exceptions
* Throwing an exception if the network error does not reach the minimum
13ae94297011cddeeff5deb64e3f4b7677f0d4ba
Installation
0
3
375
347
2011-06-15T07:09:08Z
Thwien
2
/* Installation */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.3.x''' or above. If you use the client-server mechanism of the network class, the '''curl extension''' must be available on the client host that connects to an ANN server. If you use the class ''ANN_NetworkGraph'', the '''GD library with PNG support''' must be installed.
== Installation ==
* [[Download]] the source code
* Check the integrity
>md5sum ann221*
* Unpack the source code
>tar -xzf ann221.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source:
<source lang="php">
<?php
require_once 'ANN/Loader.php';
use ANN\Network;
$objNetwork = new Network;
</source>
'''or''' as a phar library (supported as of PHP 5.3.0)
<source lang="php">
<?php
require_once 'phar://ann221.phar.gz';
use ANN\Network;
$objNetwork = new Network;
</source>
* Learn to use the library: have a look at the [[Examples]] chapter.
* For further information, e.g. about the dat files, see the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer (for PHP versions before 5.3.0)
* Do not use a debugger module such as Xdebug
* Do not use a profiling tool
* Do not set "max_execution_time = 0" in your php.ini or .htaccess file
* When running PHP on the Linux console, use "php -d max_execution_time=60 neural.php"
* Alternatively, call ini_set() to set "max_execution_time" at runtime
* Use PHP 5.3.x
e8acec67954b2dd712ec179b15f1889ac489f1c7
389
375
2012-12-12T21:02:07Z
Thwien
2
/* Installation */ New Version
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.3.x''' or above. If you use the client-server mechanism of the network class, the '''curl extension''' must be available on the client host that connects to an ANN server. If you use the class ''ANN_NetworkGraph'', the '''GD library with PNG support''' must be installed.
== Installation ==
* [[Download]] the source code
* Check the integrity
>md5sum ann223*
* Unpack the source code
>tar -xzf ann223.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source:
<source lang="php">
<?php
require_once 'ANN/Loader.php';
use ANN\Network;
$objNetwork = new Network;
</source>
'''or''' as a phar library (supported as of PHP 5.3.0)
<source lang="php">
<?php
require_once 'phar://ann223.phar.gz';
use ANN\Network;
$objNetwork = new Network;
</source>
* Learn to use the library: have a look at the [[Examples]] chapter.
* For further information, e.g. about the dat files, see the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer (for PHP versions before 5.3.0)
* Do not use a debugger module such as Xdebug
* Do not use a profiling tool
* Do not set "max_execution_time = 0" in your php.ini or .htaccess file
* When running PHP on the Linux console, use "php -d max_execution_time=60 neural.php"
* Alternatively, call ini_set() to set "max_execution_time" at runtime
* Use PHP 5.3.x
48fbb5d4171ec604768b55382a79a676248a741b
390
389
2012-12-12T21:06:42Z
Thwien
2
/* Performance issues */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.3.x''' or above. If you use the client-server mechanism of the network class, the '''curl extension''' must be available on the client host that connects to an ANN server. If you use the class ''ANN_NetworkGraph'', the '''GD library with PNG support''' must be installed.
== Installation ==
* [[Download]] the source code
* Check the integrity
>md5sum ann223*
* Unpack the source code
>tar -xzf ann223.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source:
<source lang="php">
<?php
require_once 'ANN/Loader.php';
use ANN\Network;
$objNetwork = new Network;
</source>
'''or''' as a phar library (supported as of PHP 5.3.0)
<source lang="php">
<?php
require_once 'phar://ann223.phar.gz';
use ANN\Network;
$objNetwork = new Network;
</source>
* Learn to use the library: have a look at the [[Examples]] chapter.
* For further information, e.g. about the dat files, see the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer (for PHP versions before 5.3.0)
* Do not use a debugger module such as Xdebug
* Do not use a profiling tool
* Do not set "max_execution_time = 0" in your php.ini or .htaccess file
* When running PHP on the Linux console, use "php -d max_execution_time=60 neural.php"
* Alternatively, call ini_set() to set "max_execution_time" at runtime
* Use PHP 5.3.x
* Use PHP 5.4.x (much faster than PHP 5.3.x)
ccac5e9ca146de4a1dae88ccacf9653dad32c7cd
395
390
2012-12-13T21:22:42Z
Thwien
2
/* Installation */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.3.x''' or above. If you use the client-server mechanism of the network class, the '''curl extension''' must be available on the client host that connects to an ANN server. If you use the class ''ANN_NetworkGraph'', the '''GD library with PNG support''' must be installed.
== Installation ==
* [[Download]] the source code
* Check the integrity
>md5sum ann230*
* Unpack the source code
>tar -xzf ann230.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source:
<source lang="php">
<?php
require_once 'ANN/Loader.php';
use ANN\Network;
$objNetwork = new Network;
</source>
'''or''' as a phar library (supported as of PHP 5.3.0)
<source lang="php">
<?php
require_once 'phar://ann230.phar.gz';
use ANN\Network;
$objNetwork = new Network;
</source>
* Learn to use the library: have a look at the [[Examples]] chapter.
* For further information, e.g. about the dat files, see the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer (for PHP versions before 5.3.0)
* Do not use a debugger module such as Xdebug
* Do not use a profiling tool
* Do not set "max_execution_time = 0" in your php.ini or .htaccess file
* When running PHP on the Linux console, use "php -d max_execution_time=60 neural.php"
* Alternatively, call ini_set() to set "max_execution_time" at runtime
* Use PHP 5.3.x
* Use PHP 5.4.x (much faster than PHP 5.3.x)
38cf7d1626039e694cafdae06d66d830e7d1f0d1
396
395
2012-12-13T21:23:12Z
Thwien
2
/* Requirements */
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP 5.x ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.4.x''' or above. If you use the client-server mechanism of the network class, the '''curl extension''' must be available on the client host that connects to an ANN server. If you use the class ''ANN_NetworkGraph'', the '''GD library with PNG support''' must be installed.
== Installation ==
* [[Download]] the source code
* Check the integrity
>md5sum ann230*
* Unpack the source code
>tar -xzf ann230.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source:
<source lang="php">
<?php
require_once 'ANN/Loader.php';
use ANN\Network;
$objNetwork = new Network;
</source>
'''or''' as a phar library (supported as of PHP 5.3.0)
<source lang="php">
<?php
require_once 'phar://ann230.phar.gz';
use ANN\Network;
$objNetwork = new Network;
</source>
* Learn to use the library: have a look at the [[Examples]] chapter.
* For further information, e.g. about the dat files, see the [[FAQ]] page.
== Performance issues ==
* Install Zend Optimizer (for PHP versions before 5.3.0)
* Do not use a debugger module such as Xdebug
* Do not use a profiling tool
* Do not set "max_execution_time = 0" in your php.ini or .htaccess file
* When running PHP on the Linux console, use "php -d max_execution_time=60 neural.php"
* Alternatively, call ini_set() to set "max_execution_time" at runtime
* Use PHP 5.3.x
* Use PHP 5.4.x (much faster than PHP 5.3.x)
d303cbbbf0a995b036db92627fd106a9d020da25
401
396
2020-11-27T10:34:54Z
Thwien
2
wikitext
text/x-wiki
== ANN - Artificial Neural Network for PHP ==
This chapter describes the steps to integrate the ANN source code into your project.
== Documentation ==
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
== Requirements ==
The latest implementation requires a PHP environment running '''PHP 5.4.x''' or above. If you use the client-server mechanism of the network class, the '''curl extension''' must be available on the client host that connects to an ANN server. If you use the class ''ANN_NetworkGraph'', the '''GD library with PNG support''' must be installed.
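The optional extensions named above can be probed at runtime. A minimal sketch in plain PHP (no ANN classes involved, nothing library-specific assumed):

```php
<?php
// Probe the optional requirements: curl for the client-server mechanism,
// GD for ANN_NetworkGraph. Core PHP functions only.
foreach (array('curl', 'gd') as $strExtension)
{
    if (!extension_loaded($strExtension))
    {
        echo "Note: optional extension '{$strExtension}' is not loaded\n";
    }
}
```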
== Installation ==
* [[Download]] the source code
* Check the integrity
>md5sum ann230*
* Unpack the source code
>tar -xzf ann230.tar.gz
* Copy the ANN directory into your project's library directory.
* Include it in your source:
<source lang="php">
<?php
require_once 'ANN/Loader.php';
use ANN\Network;
$objNetwork = new Network;
</source>
'''or''' as a phar library (supported as of PHP 5.3.0)
<source lang="php">
<?php
require_once 'phar://ann230.phar.gz';
use ANN\Network;
$objNetwork = new Network;
</source>
* Learn to use the library: have a look at the [[Examples]] chapter.
* For further information, e.g. about the dat files, see the [[FAQ]] page.
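As a first-use sketch after installation: the change-log confirms the class ANN\Network and the input()/output() methods of the values class, but train(), setValues() and saveToFile() are assumed method names here; treat the [[Examples]] chapter and the PHPDoc as authoritative.

```php
<?php
require_once 'ANN/Loader.php';

use ANN\Network;
use ANN\Values;

$objNetwork = new Network;

// input()/output() are mentioned in the change-log; train(), setValues()
// and saveToFile() are assumed names - verify against the PHPDoc.
$objValues = new Values;
$objValues->train()
          ->input(0, 0)->output(0)
          ->input(0, 1)->output(1)
          ->input(1, 0)->output(1)
          ->input(1, 1)->output(1); // a simple OR pattern

$objNetwork->setValues($objValues);
$objNetwork->train();
$objNetwork->saveToFile('or.dat'); // reload the dat file later instead of retraining
```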
== Performance issues ==
* Install Zend Optimizer (for PHP versions before 5.3.0)
* Do not use a debugger module such as Xdebug
* Do not use a profiling tool
* Do not set "max_execution_time = 0" in your php.ini or .htaccess file
* When running PHP on the Linux console, use "php -d max_execution_time=60 neural.php"
* Alternatively, call ini_set() to set "max_execution_time" at runtime
* Use PHP 5.3.x
* Use PHP 5.4.x (much faster than PHP 5.3.x)
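The runtime alternative to the console flag can be sketched in plain PHP; the value 60 mirrors the console example above:

```php
<?php
// Set a finite execution limit (60 seconds) at runtime instead of php.ini;
// the list above advises against the unlimited value 0.
ini_set('max_execution_time', '60');

echo ini_get('max_execution_time'), "\n"; // → 60
```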
7897df9e2d77c9402fcf0ff181a2b91360dcad86
Download
0
2
376
373
2011-06-15T07:12:39Z
Thwien
2
/* Version 2.2.0 (2011-06-01) obsolete */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers the current stable version of the ANN implementation for PHP 5.x for download. See the [[Installation]] section for requirements and for how to integrate these PHP libraries into your project.
== Version '''2.2.1''' (2011-06-15) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.3 or above)</span> ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann221.zip Download - ann221.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann221.tar.gz Download - ann221.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann221.phar.gz Download (Phar) - ann221.phar.gz] (21 KB)
'''MD5 fingerprints'''
* 7f5327d35d7b6f0dec2cee5065ea4a70 ann221.phar.gz
* 4c7a2ec74b1c9c14bd1d1fad8d71d62d ann221.tar.gz
* e6c90eb4422678306480be2cf2fca7b6 ann221.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann221.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
== Version '''2.2.0''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann220.zip Download - ann220.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann220.tar.gz Download - ann220.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann220.phar.gz Download (Phar) - ann220.phar.gz] (21 KB)
'''MD5 fingerprints'''
* d25f4b817539cdcb404b93c61fe76f47 ann220.phar.gz
* 6599d8bbbdeae02b0ffbbf5d7cb3b426 ann220.tar.gz
* a79e15ecd9a81038982affd4f8a1cc51 ann220.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann220.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is protected now
* Bug fix: Wrong output type detection in some circumstances
== Version '''2.1.7''' (2011-06-15) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.2)</span> ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann217.zip Download - ann217.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann217.tar.gz Download - ann217.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann217.phar.gz Download (Phar) - ann217.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 1bdbe4a8dcafb50c7e95289807b77b71 ann217.phar.gz
* 1ce821177150650f4e0675f02c33ad05 ann217.tar.gz
* d88ebee3d476bf488dec448d38139688 ann217.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann217.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
== Version '''2.1.6''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann216.zip Download - ann216.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann216.tar.gz Download - ann216.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann216.phar.gz Download (Phar) - ann216.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 4acbdb23bed762438a0c06c4f23f6a38 ann216.phar.gz
* 7cb3a1b1e17e1272dc02fa7fcbc7d964 ann216.tar.gz
* 9168c4f62693225d9a2785d7a72524de ann216.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann216.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in some circumstances
== Version '''2.1.5''' (2011-05-24) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann215.zip Download - ann215.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann215.tar.gz Download - ann215.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann215.phar.gz Download (Phar) - ann215.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 96ad7c92b08a44d89fd4a7a17b463a9e ann215.phar.gz
* 88ffad519058881f90eb24b4363d0d94 ann215.tar.gz
* 2d293c4f92d0d17f3e0e5ba8e79e3dbd ann215.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann215.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Splitting method ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
== Version '''2.1.4''' (2011-05-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann214.zip Download - ann214.zip] (36 KB)
* [http://ann.thwien.de/downloads/ann214.tar.gz Download - ann214.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann214.phar.gz Download (Phar) - ann214.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 998d377de058b959c5ad83141b168e5e ann214.phar.gz
* 12b7a028021477555613d533410526e0 ann214.tar.gz
* e6762c5f667e1710ac7efdca70cb7c41 ann214.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann214.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Better calculation of remaining time of running the network
* Fixing bug: generating random delta not correct
* Adding momentum value
* Simplified ANN_Neuron::adjustWeights()
* Remove possible wrong calculation in ANN_Neuron::adjustWeights() on linear networks
* Simplify ANN_Layer::calculateHiddenDeltas()
== Version '''2.1.3''' (2010-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann213.zip Download - ann213.zip] (34 KB)
* [http://ann.thwien.de/downloads/ann213.tar.gz Download - ann213.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann213.phar.gz Download (Phar) - ann213.phar.gz] (20 KB)
'''MD5 fingerprints'''
* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann213.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction to date input support class
== Version '''2.1.2''' (2009-12-26) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann212.zip Download - ann212.zip] (31 KB)
* [http://ann.thwien.de/downloads/ann212.tar.gz Download - ann212.tar.gz] (16 KB)
* [http://ann.thwien.de/downloads/ann212.phar.gz Download (Phar) - ann212.phar.gz] (18 KB)
'''MD5 fingerprints'''
* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann212_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction to classification with ANN_Classification (Example see: [[Detection of language with classification]])
* Phar support (as of PHP 5.3.0)
== Version '''2.1.1''' (2009-12-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [http://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction to string association with ANN_StringValue (Example see: [[Detection of language]])
== Version '''2.1.0''' (2009-12-22) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [http://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate to 0.7
* Code changes based on profiling results
* Change printing network details formatting
* Remove trailing php end tag
* Code-Standard
* Remove momentum, precision and unused math methods
* Remove unused methods
* Remove error weight derivative
* Removing weight decay
* Removing dynamic learning rate
* Removing algorithm switches and experimental algorithms. Just standard back propagation algorithm used
* Renaming class file names
* Learning rate and delta using
* Using learning rate
* Rounding of network error value
== Version '''2.0.7''' (2009-01-01) '''''<span style="color: #2E8B57">stable</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
== Version '''2.0.6''' (2008-12-18) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details of output differences to their desired values
* Completely rewritten code standard for variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
* [http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
9b2f15314373511087d44d9c21b3d0aea8f0efec
377
376
2011-06-15T07:54:22Z
Thwien
2
/* Version 2.2.1 (2011-06-15) stable (PHP 5.3 or above) */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers the current stable version of the ANN implementation for PHP 5.x for download. See the [[Installation]] section for requirements and for how to integrate these PHP libraries into your project.
== Version '''2.2.1''' (2011-06-15) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.3 or above)</span> ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann221.zip Download - ann221.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann221.tar.gz Download - ann221.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann221.phar.gz Download (Phar) - ann221.phar.gz] (21 KB)
'''MD5 fingerprints'''
* 7a366b40c05bb142f56412696dc86592 ann221.phar.gz
* f892a716b59ba77bc073d4e48a3c70e7 ann221.tar.gz
* 1311691e545ee549bfc4a67a8211ea4f ann221.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann221.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
== Version '''2.2.0''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann220.zip Download - ann220.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann220.tar.gz Download - ann220.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann220.phar.gz Download (Phar) - ann220.phar.gz] (21 KB)
'''MD5 fingerprints'''
* d25f4b817539cdcb404b93c61fe76f47 ann220.phar.gz
* 6599d8bbbdeae02b0ffbbf5d7cb3b426 ann220.tar.gz
* a79e15ecd9a81038982affd4f8a1cc51 ann220.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann220.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is protected now
* Bug fix: Wrong output type detection in some circumstances
== Version '''2.1.7''' (2011-06-15) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.2)</span> ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann217.zip Download - ann217.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann217.tar.gz Download - ann217.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann217.phar.gz Download (Phar) - ann217.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 1bdbe4a8dcafb50c7e95289807b77b71 ann217.phar.gz
* 1ce821177150650f4e0675f02c33ad05 ann217.tar.gz
* d88ebee3d476bf488dec448d38139688 ann217.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann217.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
== Version '''2.1.6''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann216.zip Download - ann216.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann216.tar.gz Download - ann216.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann216.phar.gz Download (Phar) - ann216.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 4acbdb23bed762438a0c06c4f23f6a38 ann216.phar.gz
* 7cb3a1b1e17e1272dc02fa7fcbc7d964 ann216.tar.gz
* 9168c4f62693225d9a2785d7a72524de ann216.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann216.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in some circumstances
== Version '''2.1.5''' (2011-05-24) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann215.zip Download - ann215.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann215.tar.gz Download - ann215.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann215.phar.gz Download (Phar) - ann215.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 96ad7c92b08a44d89fd4a7a17b463a9e ann215.phar.gz
* 88ffad519058881f90eb24b4363d0d94 ann215.tar.gz
* 2d293c4f92d0d17f3e0e5ba8e79e3dbd ann215.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann215.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Splitting method ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
== Version '''2.1.4''' (2011-05-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann214.zip Download - ann214.zip] (36 KB)
* [http://ann.thwien.de/downloads/ann214.tar.gz Download - ann214.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann214.phar.gz Download (Phar) - ann214.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 998d377de058b959c5ad83141b168e5e ann214.phar.gz
* 12b7a028021477555613d533410526e0 ann214.tar.gz
* e6762c5f667e1710ac7efdca70cb7c41 ann214.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann214.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Better calculation of remaining time of running the network
* Fixing bug: generating random delta not correct
* Adding momentum value
* Simplified ANN_Neuron::adjustWeights()
* Remove possible wrong calculation in ANN_Neuron::adjustWeights() on linear networks
* Simplify ANN_Layer::calculateHiddenDeltas()
== Version '''2.1.3''' (2010-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann213.zip Download - ann213.zip] (34 KB)
* [http://ann.thwien.de/downloads/ann213.tar.gz Download - ann213.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann213.phar.gz Download (Phar) - ann213.phar.gz] (20 KB)
'''MD5 finger prints'''
* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann213.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction to date input support class
== Version '''2.1.2''' (2009-12-26) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann212.zip Download - ann212.zip] (31 KB)
* [http://ann.thwien.de/downloads/ann212.tar.gz Download - ann212.tar.gz] (16 KB)
* [http://ann.thwien.de/downloads/ann212.phar.gz Download (Phar) - ann212.phar.gz] (18 KB)
'''MD5 finger prints'''
* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann212_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction to classification with ANN_Classification (Example see: [[Detection of language with classification]])
* Phar support (as of PHP 5.3.0)
== Version '''2.1.1''' (2009-12-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [http://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction to string association with ANN_StringValue (Example see: [[Detection of language]])
== Version '''2.1.0''' (2009-12-22) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [http://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate to 0.7
* Code changes referring profiling
* Change printing network details formatting
* Remove trailing php end tag
* Code-Standard
* Remove momentum, precision and unused math methods
* Remove unused methods
* Remove error weight derivative
* Removing weight decay
* Removing dynamic learning rate
* Removing algorithm switches and experimental algorithms. Just standard back propagation algorithm used
* Renaming class file names
* Learning rate and delta using
* Using learning rate
* Rounding of network error value
== Version '''2.0.7''' (2009-01-01) '''''<span style="color: #2E8B57">stable</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
== Version '''2.0.6''' (2008-12-18) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details of output differences to their desired values
* Complete rewritten code standard of variables
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparision in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing of overfitting (no training if input pattern produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
== Version '''2.0.3''' (2008-01-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
== Version '''2.0.1''' (2008-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
== Version '''2.0.0''' (2007-12-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Tangens hyperbolicus transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
* [http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
71a3d50cfb442c3648571616939f768ceca80336
378
377
2011-06-15T07:54:51Z
Thwien
2
/* Version 2.1.7 (2011-06-15) stable (PHP 5.2) */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers downloads of the current stable version of the ANN implementation for PHP 5.x. See the [[Installation]] section for the requirements and for instructions on integrating these PHP libraries into your project.
== Version '''2.2.1''' (2011-06-15) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.3 or above)</span> ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann221.zip Download - ann221.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann221.tar.gz Download - ann221.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann221.phar.gz Download (Phar) - ann221.phar.gz] (21 KB)
'''MD5 fingerprints'''
* 7a366b40c05bb142f56412696dc86592 ann221.phar.gz
* f892a716b59ba77bc073d4e48a3c70e7 ann221.tar.gz
* 1311691e545ee549bfc4a67a8211ea4f ann221.zip
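The fingerprints above let you check an archive before unpacking it. A minimal sketch, assuming a POSIX shell with <code>md5sum</code> available; a locally generated stand-in file takes the place of the real ann221.zip, so substitute the actual archive name and its published hash:

```shell
# Verify a downloaded archive against its published MD5 fingerprint.
# A locally generated file stands in for the real download here.
printf 'dummy archive contents' > ann-download.bin

# With a real download, paste the published hash from the list instead:
expected=$(md5sum ann-download.bin | awk '{print $1}')

# Compute the fingerprint of the local file and compare.
actual=$(md5sum ann-download.bin | awk '{print $1}')
if [ "$actual" = "$expected" ]; then
  echo "checksum OK"
else
  echo "checksum MISMATCH - discard this download" >&2
fi
```

With the real archive, compare the output of <code>md5sum ann221.zip</code> against the value listed above; any mismatch means the download is corrupt or was tampered with.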
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann221.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
* Introducing the interface \ANN\InterfaceLoadable to make it easier to identify loadable objects
== Version '''2.2.0''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann220.zip Download - ann220.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann220.tar.gz Download - ann220.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann220.phar.gz Download (Phar) - ann220.phar.gz] (21 KB)
'''MD5 fingerprints'''
* d25f4b817539cdcb404b93c61fe76f47 ann220.phar.gz
* 6599d8bbbdeae02b0ffbbf5d7cb3b426 ann220.tar.gz
* a79e15ecd9a81038982affd4f8a1cc51 ann220.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann220.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is protected now
* Bug fix: Wrong output type detection in some circumstances
== Version '''2.1.7''' (2011-06-15) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.2)</span> ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann217.zip Download - ann217.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann217.tar.gz Download - ann217.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann217.phar.gz Download (Phar) - ann217.phar.gz] (20 KB)
'''MD5 fingerprints'''
* e57c2dd57f657e75c6b69dec816ebbb3 ann217.phar.gz
* 6dea816713cb3e8b8e9dccf581dc9cf7 ann217.tar.gz
* 15a6080e0376c7aae18d079f44d3f28c ann217.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann217.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
== Version '''2.1.6''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann216.zip Download - ann216.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann216.tar.gz Download - ann216.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann216.phar.gz Download (Phar) - ann216.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 4acbdb23bed762438a0c06c4f23f6a38 ann216.phar.gz
* 7cb3a1b1e17e1272dc02fa7fcbc7d964 ann216.tar.gz
* 9168c4f62693225d9a2785d7a72524de ann216.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann216.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in some circumstances
== Version '''2.1.5''' (2011-05-24) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann215.zip Download - ann215.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann215.tar.gz Download - ann215.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann215.phar.gz Download (Phar) - ann215.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 96ad7c92b08a44d89fd4a7a17b463a9e ann215.phar.gz
* 88ffad519058881f90eb24b4363d0d94 ann215.tar.gz
* 2d293c4f92d0d17f3e0e5ba8e79e3dbd ann215.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann215.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Splitting ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
== Version '''2.1.4''' (2011-05-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann214.zip Download - ann214.zip] (36 KB)
* [http://ann.thwien.de/downloads/ann214.tar.gz Download - ann214.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann214.phar.gz Download (Phar) - ann214.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 998d377de058b959c5ad83141b168e5e ann214.phar.gz
* 12b7a028021477555613d533410526e0 ann214.tar.gz
* e6762c5f667e1710ac7efdca70cb7c41 ann214.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann214.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Better calculation of the remaining run time of the network
* Fixing bug: random delta generated incorrectly
* Adding momentum value
* Simplifying ANN_Neuron::adjustWeights()
* Removing a possibly incorrect calculation in ANN_Neuron::adjustWeights() for linear networks
* Simplifying ANN_Layer::calculateHiddenDeltas()
== Version '''2.1.3''' (2010-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann213.zip Download - ann213.zip] (34 KB)
* [http://ann.thwien.de/downloads/ann213.tar.gz Download - ann213.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann213.phar.gz Download (Phar) - ann213.phar.gz] (20 KB)
'''MD5 fingerprints'''
* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann213.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of a date input support class
== Version '''2.1.2''' (2009-12-26) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann212.zip Download - ann212.zip] (31 KB)
* [http://ann.thwien.de/downloads/ann212.tar.gz Download - ann212.tar.gz] (16 KB)
* [http://ann.thwien.de/downloads/ann212.phar.gz Download (Phar) - ann212.phar.gz] (18 KB)
'''MD5 fingerprints'''
* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann212_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of classification with ANN_Classification (for an example, see [[Detection of language with classification]])
* Phar support (as of PHP 5.3.0)
== Version '''2.1.1''' (2009-12-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [http://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of string association with ANN_StringValue (for an example, see [[Detection of language]])
== Version '''2.1.0''' (2009-12-22) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [http://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate set to 0.7
* Code changes based on profiling
* Changed formatting of printed network details
* Removed trailing PHP closing tag
* Coding standard improvements
* Removed momentum, precision and unused math methods
* Removed unused methods
* Removed error weight derivative
* Removed weight decay
* Removed dynamic learning rate
* Removed algorithm switches and experimental algorithms; only the standard backpropagation algorithm is used
* Renamed class files
* Reworked use of learning rate and delta
* Rounding of network error value
== Version '''2.0.7''' (2009-01-01) '''''<span style="color: #2E8B57">stable</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removed protected method ANN_Neuron::setOutput()
* Removed unused protected method ANN_Layer::getInputs()
* Removed unused protected property ANN_Layer::$arrInputs
* More detailed exceptions in ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removed static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increased math precision
* Class constants used for output types (increasing performance)
* Bug fix: ANN_Neuron::getOutput() returns a float, not an array
== Version '''2.0.6''' (2008-12-18) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Network details now print the differences between outputs and their desired values
* Completely rewritten naming standard for variables
* New class ANN_Values for defining input and output values
* Code examples added to the phpdoc documentation
* Internal math precision defaults to 5
== Version '''2.0.5''' (2008-12-16) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 percent
* Internal rounding of floats for performance reasons
* Class loader for all ANN classes (SPL autoload)
* Renamed the file of the ANN_Maths class
* Improved coding standard
* Bug fix: comparison in ANN_InputValue and ANN_OutputValue
== Version '''2.0.4''' (2008-01-27) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduced overfitting (no training step if an input pattern already produces its desired output)
* Improved performance of activation
* Improved performance of testing all patterns against their desired outputs
* Improved performance of calculating hidden deltas
* Improved performance by defining layer relations at construction time
* More details in printNetwork()
* Bug fix: the learning rate was not part of the saved delta value
== Version '''2.0.3''' (2008-01-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
== Version '''2.0.2''' (2008-01-14) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-server model for distributed applications
* Calculating total network error for CSV logging
== Version '''2.0.1''' (2008-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes into several files
* Version control with Subversion
* Performance improvements
* Graphical output of the neural network topology
* Logging of weights to a CSV file
== Version '''2.0.0''' (2007-12-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ==
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoided array_keys() and srand() for performance reasons
* Changes in saving and loading the network
* Printing network details to the browser
* Bug fix: initializing inputs to all hidden layers
* Bug fix: training for the first hidden layer was skipped
== Version '''1.0''' (2002) ==
'''Author: Eddy Young'''
* [http://freebsd.mu/freebsd/archives/000039.html Project page on freebsd.mu]
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
=== Version '''2.0.1''' (2008-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance issues
* Graphical output of neural network topology
* Logging of weights to csv file
=== Version '''2.0.0''' (2007-12-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Tangens hyperbolicus transfer function
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version 1.0 ==
=== Version '''1.0''' (2002) ===
'''Author: Eddy Young'''
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
eb62200bfc84a4d88c8aabdb7769a896e9674c69
383
382
2011-07-04T08:24:21Z
Thwien
2
/* Version 2.2 */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers downloads of the current stable version of the ANN implementation for PHP 5.x. See the [[Installation]] section for information about the requirements and how to integrate these PHP libraries into your project.
== Version 2.2 ==
=== Version '''2.2.2''' (2011-07-04) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.3 or above)</span> ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann222.zip Download - ann222.zip] (46 KB)
* [http://ann.thwien.de/downloads/ann222.tar.gz Download - ann222.tar.gz] (21 KB)
* [http://ann.thwien.de/downloads/ann222.phar.gz Download (Phar) - ann222.phar.gz] (22 KB)
'''MD5 fingerprints'''
* ea207fa528c6aaf5ebab9ab4b8d2dd5c ann222.phar.gz
* ed5e8a8c9cd0582cf118681cf6b22a11 ann222.tar.gz
* 8fc2a7c3816fee0b9f7ea92a40ff14a8 ann222.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann222.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Redesign of network details
* Considering CPU limits when calculating the network execution time
=== Version '''2.2.1''' (2011-06-15) '''''<span style="color: #CD5C5C">obsolete</span>''''' <span style="color: #FF8C00">(PHP 5.3 or above)</span> ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann221.zip Download - ann221.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann221.tar.gz Download - ann221.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann221.phar.gz Download (Phar) - ann221.phar.gz] (21 KB)
'''MD5 fingerprints'''
* 7a366b40c05bb142f56412696dc86592 ann221.phar.gz
* f892a716b59ba77bc073d4e48a3c70e7 ann221.tar.gz
* 1311691e545ee549bfc4a67a8211ea4f ann221.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann221.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
* New interface \ANN\InterfaceLoadable to simplify the detection of loadable objects
=== Version '''2.2.0''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann220.zip Download - ann220.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann220.tar.gz Download - ann220.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann220.phar.gz Download (Phar) - ann220.phar.gz] (21 KB)
'''MD5 fingerprints'''
* d25f4b817539cdcb404b93c61fe76f47 ann220.phar.gz
* 6599d8bbbdeae02b0ffbbf5d7cb3b426 ann220.tar.gz
* a79e15ecd9a81038982affd4f8a1cc51 ann220.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann220.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is now protected
* Bug fix: Wrong output type detection in some circumstances
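The move in 2.2.0 from prefixed class names (ANN_Network) to PHP 5.3 namespaces (\ANN\Network) can be sketched as follows. The stub class below is purely illustrative and is not the library's actual implementation:

```php
<?php
// Illustrative stub mirroring the library's move from the ANN_ prefix
// convention to a PHP 5.3 namespace. Not the library's real code.
namespace ANN;

class Network
{
    protected $learningRate;

    // As of 2.2.0 setLearningRate() is protected, so it can only be
    // called internally or from subclasses.
    protected function setLearningRate($rate)
    {
        $this->learningRate = $rate;
    }
}

// Client code refers to the class by its fully qualified name:
$network = new \ANN\Network();
var_dump($network instanceof \ANN\Network); // bool(true)
```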
== Version 2.1 ==
=== Version '''2.1.7''' (2011-06-15) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.2)</span> ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann217.zip Download - ann217.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann217.tar.gz Download - ann217.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann217.phar.gz Download (Phar) - ann217.phar.gz] (20 KB)
'''MD5 fingerprints'''
* e57c2dd57f657e75c6b69dec816ebbb3 ann217.phar.gz
* 6dea816713cb3e8b8e9dccf581dc9cf7 ann217.tar.gz
* 15a6080e0376c7aae18d079f44d3f28c ann217.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann217.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
=== Version '''2.1.6''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann216.zip Download - ann216.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann216.tar.gz Download - ann216.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann216.phar.gz Download (Phar) - ann216.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 4acbdb23bed762438a0c06c4f23f6a38 ann216.phar.gz
* 7cb3a1b1e17e1272dc02fa7fcbc7d964 ann216.tar.gz
* 9168c4f62693225d9a2785d7a72524de ann216.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann216.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in some circumstances
=== Version '''2.1.5''' (2011-05-24) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann215.zip Download - ann215.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann215.tar.gz Download - ann215.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann215.phar.gz Download (Phar) - ann215.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 96ad7c92b08a44d89fd4a7a17b463a9e ann215.phar.gz
* 88ffad519058881f90eb24b4363d0d94 ann215.tar.gz
* 2d293c4f92d0d17f3e0e5ba8e79e3dbd ann215.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann215.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Splitting the method ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Improved printing of network details, including __invoke() and __toString() conversion
=== Version '''2.1.4''' (2011-05-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann214.zip Download - ann214.zip] (36 KB)
* [http://ann.thwien.de/downloads/ann214.tar.gz Download - ann214.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann214.phar.gz Download (Phar) - ann214.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 998d377de058b959c5ad83141b168e5e ann214.phar.gz
* 12b7a028021477555613d533410526e0 ann214.tar.gz
* e6762c5f667e1710ac7efdca70cb7c41 ann214.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann214.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Improved calculation of the remaining run time of the network
* Fixing bug: incorrect generation of random delta values
* Adding momentum value
* Simplified ANN_Neuron::adjustWeights()
* Removed a potentially incorrect calculation in ANN_Neuron::adjustWeights() for linear networks
* Simplified ANN_Layer::calculateHiddenDeltas()
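The momentum value added in 2.1.4 follows the standard backpropagation form: the new weight delta is the gradient step plus a fraction of the previous delta. This is a generic sketch of that technique; the function and parameter names are made up for illustration and are not the library's ANN_Neuron::adjustWeights() code:

```php
<?php
// Generic backpropagation weight update with momentum (illustrative only,
// not the library's ANN_Neuron::adjustWeights() implementation).
function adjustWeight($weight, $previousDelta, $input, $delta,
                      $learningRate = 0.7, $momentum = 0.9)
{
    // New delta: gradient step plus a fraction of the previous delta.
    $weightDelta = $learningRate * $delta * $input
                 + $momentum * $previousDelta;

    // Return both so the caller can store the delta for the next step.
    return array($weight + $weightDelta, $weightDelta);
}

list($w, $d) = adjustWeight(0.5, 0.0, 1.0, 0.1);
// 0.7 * 0.1 * 1.0 = 0.07, so the weight becomes roughly 0.57.
```

With a previous delta of zero the momentum term vanishes; on subsequent steps it smooths the weight trajectory and can speed up convergence.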
=== Version '''2.1.3''' (2010-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann213.zip Download - ann213.zip] (34 KB)
* [http://ann.thwien.de/downloads/ann213.tar.gz Download - ann213.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann213.phar.gz Download (Phar) - ann213.phar.gz] (20 KB)
'''MD5 fingerprints'''
* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann213.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of a date input support class
=== Version '''2.1.2''' (2009-12-26) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann212.zip Download - ann212.zip] (31 KB)
* [http://ann.thwien.de/downloads/ann212.tar.gz Download - ann212.tar.gz] (16 KB)
* [http://ann.thwien.de/downloads/ann212.phar.gz Download (Phar) - ann212.phar.gz] (18 KB)
'''MD5 fingerprints'''
* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann212_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of classification support via ANN_Classification (for an example, see [[Detection of language with classification]])
* Phar support (as of PHP 5.3.0)
=== Version '''2.1.1''' (2009-12-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [http://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of string association via ANN_StringValue (for an example, see [[Detection of language]])
=== Version '''2.1.0''' (2009-12-22) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [http://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate set to 0.7
* Code changes based on profiling results
* Changed the formatting of printed network details
* Removed the trailing PHP closing tag
* Code standard improvements
* Removed momentum, precision and unused math methods
* Removed unused methods
* Removed the error weight derivative
* Removed weight decay
* Removed the dynamic learning rate
* Removed algorithm switches and experimental algorithms; only the standard backpropagation algorithm remains
* Renamed class file names
* Reworked the use of the learning rate and delta values
* Rounding of the network error value
== Version 2.0 ==
=== Version '''2.0.7''' (2009-01-01) '''''<span style="color: #2E8B57">stable</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
=== Version '''2.0.6''' (2008-12-18) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details showing the differences between outputs and their desired values
* Completely rewritten coding standard for variables
* New class ANN_Values for defining input and output values
* Code examples added to the phpdoc documentation
* Internal math precision now defaults to 5
=== Version '''2.0.5''' (2008-12-16) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 percent
* Internal rounding of floats for performance reasons
* Class loading for all ANN classes via SPL autoload
* Renamed the file of the ANN_Maths class
* Improved code standard
* Fixing bug: comparison in ANN_InputValue and ANN_OutputValue
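The SPL autoloading introduced in 2.0.5 can be sketched roughly like this. The class-to-file mapping shown is an assumption for illustration, not necessarily the library's exact file layout (a named function is used rather than a closure, since this release still targeted PHP 5.2):

```php
<?php
// Rough sketch of SPL autoloading for ANN_* classes. Illustrative only;
// the library's actual file layout may differ.
function annClassToFile($className)
{
    // Assumed mapping, e.g. ANN_Network -> ANN/Network.php
    return str_replace('_', '/', $className) . '.php';
}

function annAutoload($className)
{
    // Only handle classes belonging to this library.
    if (strpos($className, 'ANN_') !== 0) {
        return;
    }
    $file = annClassToFile($className);
    if (is_file($file)) {
        require $file;
    }
}

spl_autoload_register('annAutoload');
```

Once registered, referencing any ANN_* class triggers the autoloader, so no explicit require lines are needed per class.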
=== Version '''2.0.4''' (2008-01-27) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduction of overfitting (no training step if an input pattern already produces the desired output)
* Improved performance of activation
* Improved performance when testing all patterns against their desired outputs
* Improved performance when calculating hidden deltas
* Improved performance by defining layer relations at construction
* More details in printNetwork()
* Fixing bug: the learning rate was not part of the saved delta value
=== Version '''2.0.3''' (2008-01-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
=== Version '''2.0.2''' (2008-01-14) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
=== Version '''2.0.1''' (2008-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes into several files
* Version control via Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to csv file
=== Version '''2.0.0''' (2007-12-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() and srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
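The tanh transfer function and threshold function listed for 2.0.0 can be sketched as follows; tanh maps any net input into (−1, 1). This is a generic illustration, and the function names here are not the library's actual method names:

```php
<?php
// Hyperbolic tangent transfer function for neuron activation
// (generic sketch; the library's method names may differ).
function tanhTransfer($netInput)
{
    return tanh($netInput); // maps any real input into (-1, 1)
}

// A simple binary threshold function, as also listed in the
// 2.0.0 change log (the 0.5 limit is an assumed default).
function threshold($value, $limit = 0.5)
{
    return ($value >= $limit) ? 1 : 0;
}
```

In a multilayer perceptron the transfer function squashes each neuron's weighted input sum, while a threshold can binarize the final output for binary networks.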
== Version 1.0 ==
=== Version '''1.0''' (2002) ===
'''Author: Eddy Young'''
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
* Several performance issues
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version 1.0 ==
=== Version '''1.0''' (2002) ===
'''Author: Eddy Young'''
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
048a10bfced919db5af9537bae2662e276113b50
386
385
2011-07-04T19:33:13Z
Thwien
2
/* Version 2.2.2 (2011-07-04) stable (PHP 5.3 or above) */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers the current stable version of the ANN implementation for PHP 5.x for download. See the [[Installation]] section for requirements and for instructions on how to integrate these PHP libraries into your project.
== Version 2.2 ==
=== Version '''2.2.3''' (2012-12-12) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.3 or above)</span> ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann223.zip Download - ann223.zip] (46 KB)
* [http://ann.thwien.de/downloads/ann223.tar.gz Download - ann223.tar.gz] (21 KB)
* [http://ann.thwien.de/downloads/ann223.phar.gz Download (Phar) - ann223.phar.gz] (22 KB)
'''MD5 fingerprints'''
* 6e4f100df2d3bbb99753b224c1348fdf ann223.phar.gz
* ae3ad1cc66f8803c7bf863702f90ba67 ann223.tar.gz
* 8e6f6de3b747f42450d27fff70aa2514 ann223.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann223.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: division by zero if training time is below one second
* Adding PHP version and SAPI interface to the network information
* Tested on PHP 5.4
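Since MD5 fingerprints are published for each archive, a download can be checked before use. A minimal sketch in PHP, assuming the archive has been saved locally under the file name shown in the fingerprint list above:

```php
<?php
// Compare a downloaded archive against its published MD5 fingerprint.
// File name and expected hash are taken from the list above.
$expected = '8e6f6de3b747f42450d27fff70aa2514'; // ann223.zip
$file     = 'ann223.zip';                      // assumed local download path

if (md5_file($file) === $expected) {
    echo "Checksum OK\n";
} else {
    echo "Checksum mismatch - please download the file again\n";
}
```

Alternatively, running `md5sum ann223.zip` on the command line produces the same fingerprint.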
=== Version '''2.2.2''' (2011-07-04) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann222.zip Download - ann222.zip] (46 KB)
* [http://ann.thwien.de/downloads/ann222.tar.gz Download - ann222.tar.gz] (21 KB)
* [http://ann.thwien.de/downloads/ann222.phar.gz Download (Phar) - ann222.phar.gz] (22 KB)
'''MD5 fingerprints'''
* 20b3123169666f0b245b172126e5da96 ann222.phar.gz
* 56b3bad2cf8850aa13f10a5c4ae38cd9 ann222.tar.gz
* 2a12e0b8baf89135db35ae5aee1d1fe7 ann222.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann222.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Redesign of network details
* Considering CPU limits when calculating network execution time
=== Version '''2.2.1''' (2011-06-15) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann221.zip Download - ann221.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann221.tar.gz Download - ann221.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann221.phar.gz Download (Phar) - ann221.phar.gz] (21 KB)
'''MD5 fingerprints'''
* 7a366b40c05bb142f56412696dc86592 ann221.phar.gz
* f892a716b59ba77bc073d4e48a3c70e7 ann221.tar.gz
* 1311691e545ee549bfc4a67a8211ea4f ann221.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann221.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
* Using an interface called \ANN\InterfaceLoadable to simplify detection of loadable objects
=== Version '''2.2.0''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann220.zip Download - ann220.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann220.tar.gz Download - ann220.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann220.phar.gz Download (Phar) - ann220.phar.gz] (21 KB)
'''MD5 fingerprints'''
* d25f4b817539cdcb404b93c61fe76f47 ann220.phar.gz
* 6599d8bbbdeae02b0ffbbf5d7cb3b426 ann220.tar.gz
* a79e15ecd9a81038982affd4f8a1cc51 ann220.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann220.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is protected now
* Bug fix: Wrong output type detection in some circumstances
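With the namespaces introduced in 2.2.0, class names move from the ANN_ pseudo-namespace prefix into the \ANN namespace. A minimal usage sketch; the constructor arguments are elided because they are not documented on this page:

```php
<?php
// Before 2.2.0 (PHP 5.2 branch): classes use a pseudo-namespace prefix.
$networkOld = new ANN_Network(/* ... */);

// As of 2.2.0 (PHP 5.3 or above): classes live in the \ANN namespace.
use ANN\Network;

$networkNew = new Network(/* ... */);

// Note: \ANN\Network::setLearningRate() is protected as of 2.2.0, so the
// learning rate is handled internally (dynamic learning rate).
```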
== Version 2.1 ==
=== Version '''2.1.7''' (2011-06-15) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.2)</span> ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann217.zip Download - ann217.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann217.tar.gz Download - ann217.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann217.phar.gz Download (Phar) - ann217.phar.gz] (20 KB)
'''MD5 fingerprints'''
* e57c2dd57f657e75c6b69dec816ebbb3 ann217.phar.gz
* 6dea816713cb3e8b8e9dccf581dc9cf7 ann217.tar.gz
* 15a6080e0376c7aae18d079f44d3f28c ann217.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann217.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
=== Version '''2.1.6''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann216.zip Download - ann216.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann216.tar.gz Download - ann216.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann216.phar.gz Download (Phar) - ann216.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 4acbdb23bed762438a0c06c4f23f6a38 ann216.phar.gz
* 7cb3a1b1e17e1272dc02fa7fcbc7d964 ann216.tar.gz
* 9168c4f62693225d9a2785d7a72524de ann216.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann216.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in some circumstances
=== Version '''2.1.5''' (2011-05-24) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann215.zip Download - ann215.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann215.tar.gz Download - ann215.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann215.phar.gz Download (Phar) - ann215.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 96ad7c92b08a44d89fd4a7a17b463a9e ann215.phar.gz
* 88ffad519058881f90eb24b4363d0d94 ann215.tar.gz
* 2d293c4f92d0d17f3e0e5ba8e79e3dbd ann215.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann215.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Splitting method ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
=== Version '''2.1.4''' (2011-05-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann214.zip Download - ann214.zip] (36 KB)
* [http://ann.thwien.de/downloads/ann214.tar.gz Download - ann214.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann214.phar.gz Download (Phar) - ann214.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 998d377de058b959c5ad83141b168e5e ann214.phar.gz
* 12b7a028021477555613d533410526e0 ann214.tar.gz
* e6762c5f667e1710ac7efdca70cb7c41 ann214.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann214.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Better calculation of the remaining time for running the network
* Fixing bug: random delta generation was incorrect
* Adding momentum value
* Simplified ANN_Neuron::adjustWeights()
* Removing a possibly incorrect calculation in ANN_Neuron::adjustWeights() for linear networks
* Simplified ANN_Layer::calculateHiddenDeltas()
=== Version '''2.1.3''' (2010-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann213.zip Download - ann213.zip] (34 KB)
* [http://ann.thwien.de/downloads/ann213.tar.gz Download - ann213.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann213.phar.gz Download (Phar) - ann213.phar.gz] (20 KB)
'''MD5 fingerprints'''
* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann213.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of a date input support class
=== Version '''2.1.2''' (2009-12-26) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann212.zip Download - ann212.zip] (31 KB)
* [http://ann.thwien.de/downloads/ann212.tar.gz Download - ann212.tar.gz] (16 KB)
* [http://ann.thwien.de/downloads/ann212.phar.gz Download (Phar) - ann212.phar.gz] (18 KB)
'''MD5 fingerprints'''
* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann212_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of classification with ANN_Classification (for an example, see [[Detection of language with classification]])
* Phar support (as of PHP 5.3.0)
=== Version '''2.1.1''' (2009-12-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [http://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of string association with ANN_StringValue (for an example, see [[Detection of language]])
=== Version '''2.1.0''' (2009-12-22) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [http://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate set to 0.7
* Code changes based on profiling results
* Changed formatting of printed network details
* Removed trailing PHP end tag
* Code standard improvements
* Removed momentum, precision, and unused math methods
* Removed unused methods
* Removed error weight derivative
* Removed weight decay
* Removed dynamic learning rate
* Removed algorithm switches and experimental algorithms; only the standard backpropagation algorithm is used
* Renamed class file names
* Reworked use of learning rate and delta
* Rounding of the network error value
== Version 2.0 ==
=== Version '''2.0.7''' (2009-01-01) '''''<span style="color: #2E8B57">stable</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removed protected method ANN_Neuron::setOutput()
* Removed unused protected method ANN_Layer::getInputs()
* Removed unused protected property ANN_Layer::$arrInputs
* More detailed exceptions in ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removed static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increased math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() returns a float, not an array
=== Version '''2.0.6''' (2008-12-18) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details of output differences relative to their desired values
* Completely rewritten variable naming standard
* New class ANN_Values for defining input and output values
* Code examples added to the phpdoc
* Internal math precision defaults to 5
=== Version '''2.0.5''' (2008-12-16) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 percent
* Internal rounding of floats for performance reasons
* Class loading for all ANN classes (SPL autoload)
* Renamed the file of the ANN_Maths class
* Improved code standard
* Fixing bug: comparison in ANN_InputValue and ANN_OutputValue
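The SPL autoloading introduced in 2.0.5 means the ANN class files no longer have to be included by hand. An illustrative sketch of such a loader; the real loader's file naming and mapping in the library may differ:

```php
<?php
// Register an autoloader so that e.g. ANN_Network is loaded on first use.
// The mapping of class name to file path is an assumption for illustration.
spl_autoload_register(function (string $class): void {
    $file = str_replace('_', '/', $class) . '.php'; // ANN_Network -> ANN/Network.php
    if (is_file($file)) {
        require $file;
    }
});
```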
=== Version '''2.0.4''' (2008-01-27) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reduced overfitting (no training step if an input pattern already produces the desired output)
* Improved performance of activation
* Improved performance when testing all patterns against their desired outputs
* Improved performance when calculating hidden deltas
* Improved performance by defining layer relations at construction time
* More details in printNetwork()
* Fixing bug: learning rate was not part of the saved delta value
=== Version '''2.0.3''' (2008-01-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
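Shuffling the input patterns once per epoch, as introduced in this release, guarantees that every pattern is trained exactly once per epoch, unlike drawing patterns at random positions. A minimal sketch in Python with a stand-in training step (`train_one` is hypothetical, not the library's API):

```python
import random

# Per-epoch shuffling: reorder the pattern list at the start of each
# epoch, then train on every pattern exactly once.

def train(patterns, epochs, train_one):
    for _ in range(epochs):
        random.shuffle(patterns)          # new order every epoch
        for inputs, desired in patterns:
            train_one(inputs, desired)

seen = []
train([([0, 1], [1]), ([1, 0], [1])], epochs=2,
      train_one=lambda i, d: seen.append((tuple(i), tuple(d))))
print(len(seen))  # 4: each of the 2 patterns trained once in each of 2 epochs
```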
=== Version '''2.0.2''' (2008-01-14) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for CSV logging
=== Version '''2.0.1''' (2008-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes into several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to CSV file
=== Version '''2.0.0''' (2007-12-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance improvements
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
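The hyperbolic tangent transfer function added in this release squashes a neuron's weighted input sum into (-1, 1), and its derivative is what backpropagation uses to compute the deltas. A short sketch of the mathematics (not the library's code):

```python
import math

# tanh as a neuron transfer function, plus its derivative expressed in
# terms of the neuron's output: d/dx tanh(x) = 1 - tanh(x)^2.

def tanh(x):
    return math.tanh(x)

def tanh_derivative(output):
    # `output` is tanh(x), so the derivative needs no second tanh call
    return 1.0 - output * output

out = tanh(0.0)
print(out, tanh_derivative(out))  # 0.0 1.0
```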
== Version 1.0 ==
=== Version '''1.0''' (2002) ===
'''Author: Eddy Young'''
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
232dc1cab5f40ae812d46b6ededef96e9b8f7101
393
387
2012-12-13T21:16:06Z
Thwien
2
/* Version 2.3 */
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers downloads of the current stable version of the ANN implementation for PHP 5.x. See the [[Installation]] section for information about requirements and how to integrate these PHP libraries into your project.
== Version 2.3 ==
=== Version '''2.3.0''' (2012-12-13) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.4 or above)</span> ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann230.zip Download - ann230.zip] (47 KB)
* [http://ann.thwien.de/downloads/ann230.tar.gz Download - ann230.tar.gz] (21 KB)
* [http://ann.thwien.de/downloads/ann230.phar.gz Download (Phar) - ann230.phar.gz] (22 KB)
'''MD5 fingerprints'''
* fd5d3ded9761dd745621ab06460e3e21 ann230.phar.gz
* dd1e604c7e16e666a98dc51f6dc38507 ann230.tar.gz
* 3ee3741ee830d4a5a095f122e5858ae6 ann230.zip
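The fingerprints above can be checked against a downloaded archive before unpacking it. A small Python sketch; the file name at the bottom is just whichever archive you downloaded:

```python
import hashlib

# Compute the MD5 fingerprint of a file in chunks, so large archives
# do not have to fit in memory at once.

def md5_of(path):
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Example (hypothetical local file name; compare to the list above):
# assert md5_of("ann230.zip") == "3ee3741ee830d4a5a095f122e5858ae6"
```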
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann230.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Using traits for performance reasons
* Checking PHP version for compatibility with the ANN library
== Version 2.2 ==
=== Version '''2.2.3''' (2012-12-12) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.3 or above)</span> ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann223.zip Download - ann223.zip] (47 KB)
* [http://ann.thwien.de/downloads/ann223.tar.gz Download - ann223.tar.gz] (21 KB)
* [http://ann.thwien.de/downloads/ann223.phar.gz Download (Phar) - ann223.phar.gz] (22 KB)
'''MD5 fingerprints'''
* 6e4f100df2d3bbb99753b224c1348fdf ann223.phar.gz
* ae3ad1cc66f8803c7bf863702f90ba67 ann223.tar.gz
* 8e6f6de3b747f42450d27fff70aa2514 ann223.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann223.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: division by zero if training time is below one second
* Adding PHP version and SAPI interface to network information
* Tested on PHP 5.4
=== Version '''2.2.2''' (2011-07-04) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann222.zip Download - ann222.zip] (46 KB)
* [http://ann.thwien.de/downloads/ann222.tar.gz Download - ann222.tar.gz] (21 KB)
* [http://ann.thwien.de/downloads/ann222.phar.gz Download (Phar) - ann222.phar.gz] (22 KB)
'''MD5 fingerprints'''
* 20b3123169666f0b245b172126e5da96 ann222.phar.gz
* 56b3bad2cf8850aa13f10a5c4ae38cd9 ann222.tar.gz
* 2a12e0b8baf89135db35ae5aee1d1fe7 ann222.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann222.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Redesign of network details
* Considering CPU limits when calculating network execution time
=== Version '''2.2.1''' (2011-06-15) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann221.zip Download - ann221.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann221.tar.gz Download - ann221.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann221.phar.gz Download (Phar) - ann221.phar.gz] (21 KB)
'''MD5 fingerprints'''
* 7a366b40c05bb142f56412696dc86592 ann221.phar.gz
* f892a716b59ba77bc073d4e48a3c70e7 ann221.tar.gz
* 1311691e545ee549bfc4a67a8211ea4f ann221.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann221.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
* Using an interface called \ANN\InterfaceLoadable to simplify detection of loadable objects
=== Version '''2.2.0''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann220.zip Download - ann220.zip] (39 KB)
* [http://ann.thwien.de/downloads/ann220.tar.gz Download - ann220.tar.gz] (19 KB)
* [http://ann.thwien.de/downloads/ann220.phar.gz Download (Phar) - ann220.phar.gz] (21 KB)
'''MD5 fingerprints'''
* d25f4b817539cdcb404b93c61fe76f47 ann220.phar.gz
* 6599d8bbbdeae02b0ffbbf5d7cb3b426 ann220.tar.gz
* a79e15ecd9a81038982affd4f8a1cc51 ann220.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann220.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is protected now
* Bug fix: Wrong output type detection in some circumstances
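The dynamic learning rate reintroduced in this release is not specified in detail on this page; one common scheme grows the rate slightly while the network error falls and cuts it sharply when the error rises. A sketch under that assumption, with all constants illustrative:

```python
# Hypothetical dynamic learning-rate rule (not necessarily the
# library's): multiply the rate by `up` after an improving epoch and
# by `down` after a worsening one.

def adapt_learning_rate(lr, error, previous_error, up=1.05, down=0.5):
    return lr * up if error < previous_error else lr * down

lr = 0.7
lr = adapt_learning_rate(lr, error=0.10, previous_error=0.12)  # improving
print(round(lr, 3))   # 0.735
lr = adapt_learning_rate(lr, error=0.15, previous_error=0.10)  # worsening
print(round(lr, 4))   # 0.3675
```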
== Version 2.1 ==
=== Version '''2.1.7''' (2011-06-15) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.2)</span> ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann217.zip Download - ann217.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann217.tar.gz Download - ann217.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann217.phar.gz Download (Phar) - ann217.phar.gz] (20 KB)
'''MD5 fingerprints'''
* e57c2dd57f657e75c6b69dec816ebbb3 ann217.phar.gz
* 6dea816713cb3e8b8e9dccf581dc9cf7 ann217.tar.gz
* 15a6080e0376c7aae18d079f44d3f28c ann217.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann217.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
=== Version '''2.1.6''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann216.zip Download - ann216.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann216.tar.gz Download - ann216.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann216.phar.gz Download (Phar) - ann216.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 4acbdb23bed762438a0c06c4f23f6a38 ann216.phar.gz
* 7cb3a1b1e17e1272dc02fa7fcbc7d964 ann216.tar.gz
* 9168c4f62693225d9a2785d7a72524de ann216.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann216.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in some circumstances
=== Version '''2.1.5''' (2011-05-24) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann215.zip Download - ann215.zip] (37 KB)
* [http://ann.thwien.de/downloads/ann215.tar.gz Download - ann215.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann215.phar.gz Download (Phar) - ann215.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 96ad7c92b08a44d89fd4a7a17b463a9e ann215.phar.gz
* 88ffad519058881f90eb24b4363d0d94 ann215.tar.gz
* 2d293c4f92d0d17f3e0e5ba8e79e3dbd ann215.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann215.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Splitting method ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
=== Version '''2.1.4''' (2011-05-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann214.zip Download - ann214.zip] (36 KB)
* [http://ann.thwien.de/downloads/ann214.tar.gz Download - ann214.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann214.phar.gz Download (Phar) - ann214.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 998d377de058b959c5ad83141b168e5e ann214.phar.gz
* 12b7a028021477555613d533410526e0 ann214.tar.gz
* e6762c5f667e1710ac7efdca70cb7c41 ann214.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann214.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Better calculation of the remaining network run time
* Fixing bug: incorrect generation of random deltas
* Adding momentum value
* Simplifying ANN_Neuron::adjustWeights()
* Removing a possibly wrong calculation in ANN_Neuron::adjustWeights() on linear networks
* Simplifying ANN_Layer::calculateHiddenDeltas()
=== Version '''2.1.3''' (2010-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann213.zip Download - ann213.zip] (34 KB)
* [http://ann.thwien.de/downloads/ann213.tar.gz Download - ann213.tar.gz] (18 KB)
* [http://ann.thwien.de/downloads/ann213.phar.gz Download (Phar) - ann213.phar.gz] (20 KB)
'''MD5 fingerprints'''
* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip
'''Documentation'''
* [http://ann.thwien.de/downloads/ann213.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of a date input support class
=== Version '''2.1.2''' (2009-12-26) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann212.zip Download - ann212.zip] (31 KB)
* [http://ann.thwien.de/downloads/ann212.tar.gz Download - ann212.tar.gz] (16 KB)
* [http://ann.thwien.de/downloads/ann212.phar.gz Download (Phar) - ann212.phar.gz] (18 KB)
'''MD5 fingerprints'''
* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
'''Documentation'''
* [http://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [http://ann.thwien.de/downloads/ann212_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of classification with ANN_Classification (for an example, see [[Detection of language with classification]])
* Phar support (as of PHP 5.3.0)
=== Version '''2.1.1''' (2009-12-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [http://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of string association with ANN_StringValue (for an example, see [[Detection of language]])
=== Version '''2.1.0''' (2009-12-22) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [http://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate set to 0.7
* Code changes based on profiling
* Changed formatting of printed network details
* Removing trailing PHP end tag
* Code standard improvements
* Removing momentum, precision and unused math methods
* Removing unused methods
* Removing error weight derivative
* Removing weight decay
* Removing dynamic learning rate
* Removing algorithm switches and experimental algorithms; only the standard backpropagation algorithm remains
* Renaming class file names
* Revised use of learning rate and delta values
* Rounding of network error value
== Version 2.0 ==
=== Version '''2.0.7''' (2009-01-01) '''''<span style="color: #2E8B57">stable</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [http://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() returns float, not array
=== Version '''2.0.6''' (2008-12-18) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [http://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details of output differences from their desired values
* Completely rewritten variable naming standard
* New class ANN_Values for defining input and output values
* Code examples added to the phpdoc
* Internal math precision defaults to 5
=== Version '''2.0.5''' (2008-12-16) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [http://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [http://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance reasons
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: comparison in ANN_InputValue and ANN_OutputValue
=== Version '''2.0.4''' (2008-01-27) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [http://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if an input pattern already produces its desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relations at construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
=== Version '''2.0.3''' (2008-01-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [http://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
=== Version '''2.0.2''' (2008-01-14) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [http://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for CSV logging
=== Version '''2.0.1''' (2008-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [http://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes into several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to CSV file
=== Version '''2.0.0''' (2007-12-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [http://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [http://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent transfer function
* Several performance improvements
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version 1.0 ==
=== Version '''1.0''' (2002) ===
'''Author: Eddy Young'''
* [http://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
37e55536f1f40e6fa147354892eeb099d21149c3
399
393
2020-11-27T08:16:15Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP 5.x'''</big>
This page offers downloads of the current stable version of the ANN implementation for PHP 5.x. See the [[Installation]] section for information about requirements and how to integrate these PHP libraries into your project.
== Version 2.3 ==
=== Version '''2.3.0''' (2012-12-13) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.4 or above)</span> ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann230.zip Download - ann230.zip] (47 KB)
* [https://ann.thwien.de/downloads/ann230.tar.gz Download - ann230.tar.gz] (21 KB)
* [https://ann.thwien.de/downloads/ann230.phar.gz Download (Phar) - ann230.phar.gz] (22 KB)
'''MD5 fingerprints'''
* fd5d3ded9761dd745621ab06460e3e21 ann230.phar.gz
* dd1e604c7e16e666a98dc51f6dc38507 ann230.tar.gz
* 3ee3741ee830d4a5a095f122e5858ae6 ann230.zip
'''Documentation'''
* [https://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [https://ann.thwien.de/downloads/ann230.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Using traits for performance reasons
* Checking PHP version for compatibility with the ANN library
== Version 2.2 ==
=== Version '''2.2.3''' (2012-12-12) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.3 or above)</span> ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann223.zip Download - ann223.zip] (47 KB)
* [https://ann.thwien.de/downloads/ann223.tar.gz Download - ann223.tar.gz] (21 KB)
* [https://ann.thwien.de/downloads/ann223.phar.gz Download (Phar) - ann223.phar.gz] (22 KB)
'''MD5 fingerprints'''
* 6e4f100df2d3bbb99753b224c1348fdf ann223.phar.gz
* ae3ad1cc66f8803c7bf863702f90ba67 ann223.tar.gz
* 8e6f6de3b747f42450d27fff70aa2514 ann223.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann223.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: division by zero if training time is below one second
* Adding PHP version and SAPI interface to network information
* Tested on PHP 5.4
=== Version '''2.2.2''' (2011-07-04) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann222.zip Download - ann222.zip] (46 KB)
* [https://ann.thwien.de/downloads/ann222.tar.gz Download - ann222.tar.gz] (21 KB)
* [https://ann.thwien.de/downloads/ann222.phar.gz Download (Phar) - ann222.phar.gz] (22 KB)
'''MD5 fingerprints'''
* 20b3123169666f0b245b172126e5da96 ann222.phar.gz
* 56b3bad2cf8850aa13f10a5c4ae38cd9 ann222.tar.gz
* 2a12e0b8baf89135db35ae5aee1d1fe7 ann222.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann222.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Redesign of network details
* Considering CPU limits when calculating network execution time
=== Version '''2.2.1''' (2011-06-15) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann221.zip Download - ann221.zip] (39 KB)
* [https://ann.thwien.de/downloads/ann221.tar.gz Download - ann221.tar.gz] (19 KB)
* [https://ann.thwien.de/downloads/ann221.phar.gz Download (Phar) - ann221.phar.gz] (21 KB)
'''MD5 fingerprints'''
* 7a366b40c05bb142f56412696dc86592 ann221.phar.gz
* f892a716b59ba77bc073d4e48a3c70e7 ann221.tar.gz
* 1311691e545ee549bfc4a67a8211ea4f ann221.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann221.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
* Using an interface called \ANN\InterfaceLoadable to simplify detection of loadable objects
=== Version '''2.2.0''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann220.zip Download - ann220.zip] (39 KB)
* [https://ann.thwien.de/downloads/ann220.tar.gz Download - ann220.tar.gz] (19 KB)
* [https://ann.thwien.de/downloads/ann220.phar.gz Download (Phar) - ann220.phar.gz] (21 KB)
'''MD5 fingerprints'''
* d25f4b817539cdcb404b93c61fe76f47 ann220.phar.gz
* 6599d8bbbdeae02b0ffbbf5d7cb3b426 ann220.tar.gz
* a79e15ecd9a81038982affd4f8a1cc51 ann220.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann220.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is protected now
* Bug fix: Wrong output type detection in some circumstances
== Version 2.1 ==
=== Version '''2.1.7''' (2011-06-15) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.2)</span> ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann217.zip Download - ann217.zip] (37 KB)
* [https://ann.thwien.de/downloads/ann217.tar.gz Download - ann217.tar.gz] (18 KB)
* [https://ann.thwien.de/downloads/ann217.phar.gz Download (Phar) - ann217.phar.gz] (20 KB)
'''MD5 fingerprints'''
* e57c2dd57f657e75c6b69dec816ebbb3 ann217.phar.gz
* 6dea816713cb3e8b8e9dccf581dc9cf7 ann217.tar.gz
* 15a6080e0376c7aae18d079f44d3f28c ann217.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann217.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
=== Version '''2.1.6''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann216.zip Download - ann216.zip] (37 KB)
* [https://ann.thwien.de/downloads/ann216.tar.gz Download - ann216.tar.gz] (18 KB)
* [https://ann.thwien.de/downloads/ann216.phar.gz Download (Phar) - ann216.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 4acbdb23bed762438a0c06c4f23f6a38 ann216.phar.gz
* 7cb3a1b1e17e1272dc02fa7fcbc7d964 ann216.tar.gz
* 9168c4f62693225d9a2785d7a72524de ann216.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann216.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in some circumstances
=== Version '''2.1.5''' (2011-05-24) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann215.zip Download - ann215.zip] (37 KB)
* [https://ann.thwien.de/downloads/ann215.tar.gz Download - ann215.tar.gz] (18 KB)
* [https://ann.thwien.de/downloads/ann215.phar.gz Download (Phar) - ann215.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 96ad7c92b08a44d89fd4a7a17b463a9e ann215.phar.gz
* 88ffad519058881f90eb24b4363d0d94 ann215.tar.gz
* 2d293c4f92d0d17f3e0e5ba8e79e3dbd ann215.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann215.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Splitting method ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
=== Version '''2.1.4''' (2011-05-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann214.zip Download - ann214.zip] (36 KB)
* [https://ann.thwien.de/downloads/ann214.tar.gz Download - ann214.tar.gz] (18 KB)
* [https://ann.thwien.de/downloads/ann214.phar.gz Download (Phar) - ann214.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 998d377de058b959c5ad83141b168e5e ann214.phar.gz
* 12b7a028021477555613d533410526e0 ann214.tar.gz
* e6762c5f667e1710ac7efdca70cb7c41 ann214.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann214.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Better calculation of the remaining network run time
* Fixing bug: incorrect generation of random deltas
* Adding momentum value
* Simplifying ANN_Neuron::adjustWeights()
* Removing a possibly wrong calculation in ANN_Neuron::adjustWeights() on linear networks
* Simplifying ANN_Layer::calculateHiddenDeltas()
=== Version '''2.1.3''' (2010-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann213.zip Download - ann213.zip] (34 KB)
* [https://ann.thwien.de/downloads/ann213.tar.gz Download - ann213.tar.gz] (18 KB)
* [https://ann.thwien.de/downloads/ann213.phar.gz Download (Phar) - ann213.phar.gz] (20 KB)
'''MD5 fingerprints'''
* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann213.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of a date input support class
=== Version '''2.1.2''' (2009-12-26) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann212.zip Download - ann212.zip] (31 KB)
* [https://ann.thwien.de/downloads/ann212.tar.gz Download - ann212.tar.gz] (16 KB)
* [https://ann.thwien.de/downloads/ann212.phar.gz Download (Phar) - ann212.phar.gz] (18 KB)
'''MD5 fingerprints'''
* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
'''Documentation'''
* [https://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [https://ann.thwien.de/downloads/ann212_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of classification with ANN_Classification (for an example, see [[Detection of language with classification]])
* Phar support (as of PHP 5.3.0)
=== Version '''2.1.1''' (2009-12-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [https://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [https://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of string association with ANN_StringValue (for an example, see [[Detection of language]])
=== Version '''2.1.0''' (2009-12-22) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [https://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [https://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate set to 0.7
* Code changes based on profiling
* Changed formatting of printed network details
* Removing trailing PHP end tag
* Code standard improvements
* Removing momentum, precision and unused math methods
* Removing unused methods
* Removing error weight derivative
* Removing weight decay
* Removing dynamic learning rate
* Removing algorithm switches and experimental algorithms; only the standard backpropagation algorithm remains
* Renaming class file names
* Revised use of learning rate and delta values
* Rounding of network error value
== Version 2.0 ==
=== Version '''2.0.7''' (2009-01-01) '''''<span style="color: #2E8B57">stable</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [https://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [https://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() returns float, not array
=== Version '''2.0.6''' (2008-12-18) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [https://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [https://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details of output differences from their desired values
* Completely rewritten variable naming standard
* New class ANN_Values for defining input and output values
* Code examples added to the phpdoc
* Internal math precision defaults to 5
=== Version '''2.0.5''' (2008-12-16) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [https://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [https://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance reasons
* Class autoloading for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
=== Version '''2.0.4''' (2008-01-27) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [https://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if input pattern already produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns against their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relations at construction time
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
=== Version '''2.0.3''' (2008-01-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [https://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
=== Version '''2.0.2''' (2008-01-14) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [https://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for CSV logging
=== Version '''2.0.1''' (2008-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [https://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes into individual files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to CSV file
=== Version '''2.0.0''' (2007-12-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [https://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version 1.0 ==
=== Version '''1.0''' (2002) ===
'''Author: Eddy Young'''
* [https://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
ccdb17f9a208f30c8444796464633467c4b64609
405
399
2020-11-27T10:41:22Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP'''</big>
This page offers the current stable version of the ANN implementation for PHP 5.x for download. See the [[Installation]] section for requirements and for information on how to integrate these PHP libraries into your project.
== Version 2.3 ==
=== Version '''2.3.0''' (2012-12-13) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.4 or above)</span> ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann230.zip Download - ann230.zip] (47 KB)
* [https://ann.thwien.de/downloads/ann230.tar.gz Download - ann230.tar.gz] (21 KB)
* [https://ann.thwien.de/downloads/ann230.phar.gz Download (Phar) - ann230.phar.gz] (22 KB)
'''MD5 fingerprints'''
* fd5d3ded9761dd745621ab06460e3e21 ann230.phar.gz
* dd1e604c7e16e666a98dc51f6dc38507 ann230.tar.gz
* 3ee3741ee830d4a5a095f122e5858ae6 ann230.zip
'''Documentation'''
* [https://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [https://ann.thwien.de/downloads/ann230.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Using traits for performance reasons
* Checking PHP version for compatibility with the ANN library
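The MD5 fingerprints listed for each release can be used to check that a download arrived intact. A minimal shell sketch, assuming the archive sits in the current directory and that the `md5sum` tool is available:

```shell
# Compare a file's MD5 digest against an expected fingerprint.
# Usage: verify_md5 <file> <expected-hex-digest>
verify_md5() {
  actual=$(md5sum "$1" | awk '{print $1}')
  [ "$actual" = "$2" ] && echo "OK: $1" || echo "MISMATCH: $1 ($actual)"
}

# Example with the published fingerprint for ann230.zip:
# verify_md5 ann230.zip 3ee3741ee830d4a5a095f122e5858ae6
```

A mismatch usually indicates a truncated or corrupted download rather than a wrong version.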
== Version 2.2 ==
=== Version '''2.2.3''' (2012-12-12) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.3 or above)</span> ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann223.zip Download - ann223.zip] (47 KB)
* [https://ann.thwien.de/downloads/ann223.tar.gz Download - ann223.tar.gz] (21 KB)
* [https://ann.thwien.de/downloads/ann223.phar.gz Download (Phar) - ann223.phar.gz] (22 KB)
'''MD5 fingerprints'''
* 6e4f100df2d3bbb99753b224c1348fdf ann223.phar.gz
* ae3ad1cc66f8803c7bf863702f90ba67 ann223.tar.gz
* 8e6f6de3b747f42450d27fff70aa2514 ann223.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann223.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: division by zero if training time is below one second
* Adding PHP version and SAPI interface to network information
* Tested on PHP 5.4
=== Version '''2.2.2''' (2011-07-04) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann222.zip Download - ann222.zip] (46 KB)
* [https://ann.thwien.de/downloads/ann222.tar.gz Download - ann222.tar.gz] (21 KB)
* [https://ann.thwien.de/downloads/ann222.phar.gz Download (Phar) - ann222.phar.gz] (22 KB)
'''MD5 fingerprints'''
* 20b3123169666f0b245b172126e5da96 ann222.phar.gz
* 56b3bad2cf8850aa13f10a5c4ae38cd9 ann222.tar.gz
* 2a12e0b8baf89135db35ae5aee1d1fe7 ann222.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann222.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Redesign of network details
* Considering CPU limits when calculating network execution time
=== Version '''2.2.1''' (2011-06-15) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann221.zip Download - ann221.zip] (39 KB)
* [https://ann.thwien.de/downloads/ann221.tar.gz Download - ann221.tar.gz] (19 KB)
* [https://ann.thwien.de/downloads/ann221.phar.gz Download (Phar) - ann221.phar.gz] (21 KB)
'''MD5 fingerprints'''
* 7a366b40c05bb142f56412696dc86592 ann221.phar.gz
* f892a716b59ba77bc073d4e48a3c70e7 ann221.tar.gz
* 1311691e545ee549bfc4a67a8211ea4f ann221.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann221.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
* Using the interface \ANN\InterfaceLoadable to simplify the detection of loadable objects
=== Version '''2.2.0''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann220.zip Download - ann220.zip] (39 KB)
* [https://ann.thwien.de/downloads/ann220.tar.gz Download - ann220.tar.gz] (19 KB)
* [https://ann.thwien.de/downloads/ann220.phar.gz Download (Phar) - ann220.phar.gz] (21 KB)
'''MD5 fingerprints'''
* d25f4b817539cdcb404b93c61fe76f47 ann220.phar.gz
* 6599d8bbbdeae02b0ffbbf5d7cb3b426 ann220.tar.gz
* a79e15ecd9a81038982affd4f8a1cc51 ann220.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann220.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is protected now
* Bug fix: Wrong output type detection in some circumstances
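The change log above notes a dynamic learning rate and that \ANN\Network::setLearningRate() is now protected, i.e. the network manages the rate itself. The library's actual adjustment rule is not documented here; the following is only an illustrative sketch of one common scheme, where the rate grows slightly while the network error falls and is cut back when it rises:

```php
<?php
// Illustrative dynamic learning-rate rule (NOT the library's actual code):
// speed up a little while the error keeps falling, back off strongly when
// it rises. The bounds keep the rate in a sane range.
function adjustLearningRate(float $rate, float $prevError, float $error): float
{
    if ($error < $prevError) {
        $rate *= 1.05;   // error fell: increase the rate slightly
    } else {
        $rate *= 0.5;    // error rose: halve the rate
    }
    return max(0.01, min(0.9, $rate));
}
```

The asymmetry (small increase, large decrease) is a conventional choice that keeps training from oscillating once the rate becomes too large.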
== Version 2.1 ==
=== Version '''2.1.7''' (2011-06-15) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.2)</span> ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann217.zip Download - ann217.zip] (37 KB)
* [https://ann.thwien.de/downloads/ann217.tar.gz Download - ann217.tar.gz] (18 KB)
* [https://ann.thwien.de/downloads/ann217.phar.gz Download (Phar) - ann217.phar.gz] (20 KB)
'''MD5 fingerprints'''
* e57c2dd57f657e75c6b69dec816ebbb3 ann217.phar.gz
* 6dea816713cb3e8b8e9dccf581dc9cf7 ann217.tar.gz
* 15a6080e0376c7aae18d079f44d3f28c ann217.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann217.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
=== Version '''2.1.6''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann216.zip Download - ann216.zip] (37 KB)
* [https://ann.thwien.de/downloads/ann216.tar.gz Download - ann216.tar.gz] (18 KB)
* [https://ann.thwien.de/downloads/ann216.phar.gz Download (Phar) - ann216.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 4acbdb23bed762438a0c06c4f23f6a38 ann216.phar.gz
* 7cb3a1b1e17e1272dc02fa7fcbc7d964 ann216.tar.gz
* 9168c4f62693225d9a2785d7a72524de ann216.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann216.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in some circumstances
=== Version '''2.1.5''' (2011-05-24) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann215.zip Download - ann215.zip] (37 KB)
* [https://ann.thwien.de/downloads/ann215.tar.gz Download - ann215.tar.gz] (18 KB)
* [https://ann.thwien.de/downloads/ann215.phar.gz Download (Phar) - ann215.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 96ad7c92b08a44d89fd4a7a17b463a9e ann215.phar.gz
* 88ffad519058881f90eb24b4363d0d94 ann215.tar.gz
* 2d293c4f92d0d17f3e0e5ba8e79e3dbd ann215.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann215.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Splitting method ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
=== Version '''2.1.4''' (2011-05-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann214.zip Download - ann214.zip] (36 KB)
* [https://ann.thwien.de/downloads/ann214.tar.gz Download - ann214.tar.gz] (18 KB)
* [https://ann.thwien.de/downloads/ann214.phar.gz Download (Phar) - ann214.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 998d377de058b959c5ad83141b168e5e ann214.phar.gz
* 12b7a028021477555613d533410526e0 ann214.tar.gz
* e6762c5f667e1710ac7efdca70cb7c41 ann214.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann214.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Better calculation of the remaining runtime of the network
* Fixing bug: incorrect generation of random delta
* Adding momentum value
* Simplified ANN_Neuron::adjustWeights()
* Removing a possibly wrong calculation in ANN_Neuron::adjustWeights() for linear networks
* Simplify ANN_Layer::calculateHiddenDeltas()
=== Version '''2.1.3''' (2010-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann213.zip Download - ann213.zip] (34 KB)
* [https://ann.thwien.de/downloads/ann213.tar.gz Download - ann213.tar.gz] (18 KB)
* [https://ann.thwien.de/downloads/ann213.phar.gz Download (Phar) - ann213.phar.gz] (20 KB)
'''MD5 fingerprints'''
* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann213.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of a date input support class
=== Version '''2.1.2''' (2009-12-26) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann212.zip Download - ann212.zip] (31 KB)
* [https://ann.thwien.de/downloads/ann212.tar.gz Download - ann212.tar.gz] (16 KB)
* [https://ann.thwien.de/downloads/ann212.phar.gz Download (Phar) - ann212.phar.gz] (18 KB)
'''MD5 fingerprints'''
* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
'''Documentation'''
* [https://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [https://ann.thwien.de/downloads/ann212_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of classification with ANN_Classification (for an example, see [[Detection of language with classification]])
* Phar support (as of PHP 5.3.0)
=== Version '''2.1.1''' (2009-12-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [https://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [https://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of string association with ANN_StringValue (for an example, see [[Detection of language]])
=== Version '''2.1.0''' (2009-12-22) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [https://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [https://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate set to 0.7
* Code changes based on profiling
* Changed formatting of printed network details
* Removing trailing PHP end tag
* Improving code standard
* Removing momentum, precision and unused math methods
* Removing unused methods
* Removing error weight derivative
* Removing weight decay
* Removing dynamic learning rate
* Removing algorithm switches and experimental algorithms; only the standard backpropagation algorithm remains
* Renaming class file names
* Applying the learning rate to weight delta calculation
* Rounding of network error value
== Version 2.0 ==
=== Version '''2.0.7''' (2009-01-01) '''''<span style="color: #2E8B57">stable</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [https://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [https://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
=== Version '''2.0.6''' (2008-12-18) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [https://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [https://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing network details showing the differences between outputs and their desired values
* Completely rewritten variable naming standard
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
=== Version '''2.0.5''' (2008-12-16) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [https://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [https://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance reasons
* Class autoloading for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: Comparison in ANN_InputValue and ANN_OutputValue
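The SPL autoload entry above means the ANN classes no longer need manual require calls. A minimal sketch of how such an autoloader can map an ANN_* class name to a file; the directory layout shown is an assumption for illustration, not necessarily the library's actual layout:

```php
<?php
// Map a class name like "ANN_Neuron" to a file path such as "ANN/Neuron.php".
// This layout is assumed for illustration only.
function annClassToFile(string $className): string
{
    return str_replace('_', '/', $className) . '.php';
}

// Register the autoloader so that e.g. "new ANN_Neuron()" triggers the
// include lazily, only when the class is first used.
spl_autoload_register(function (string $className): void {
    $file = annClassToFile($className);
    if (is_file($file)) {
        require $file;
    }
});
```

With an autoloader registered, a single bootstrap include replaces one require line per class file.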
=== Version '''2.0.4''' (2008-01-27) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [https://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training if input pattern already produces desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns against their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relations at construction time
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
=== Version '''2.0.3''' (2008-01-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [https://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
=== Version '''2.0.2''' (2008-01-14) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [https://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for CSV logging
=== Version '''2.0.1''' (2008-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [https://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes into individual files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to CSV file
=== Version '''2.0.0''' (2007-12-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [https://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Hyperbolic tangent (tanh) transfer function
* Several performance improvements
* Avoiding array_keys() & srand() for performance reasons
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
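Several entries above (threshold function, tanh transfer function, momentum) concern the core training step of a multilayer perceptron. As a self-contained sketch of the underlying math, not the library's actual code: tanh squashes a neuron's weighted input into (-1, 1), its derivative 1 - tanh²(x) scales the backpropagated error, and momentum blends the previous weight change into the current one:

```php
<?php
// tanh transfer function and its derivative, as used in backpropagation.
function transfer(float $x): float
{
    return tanh($x);
}

function transferDerivative(float $x): float
{
    $t = tanh($x);
    return 1.0 - $t * $t;   // d/dx tanh(x) = 1 - tanh^2(x)
}

// One weight change with learning rate and momentum (illustrative names):
// the momentum term reuses a fraction of the previous change, which damps
// oscillation and speeds up travel along consistent gradient directions.
function weightChange(float $learningRate, float $delta, float $input,
                      float $momentum, float $previousChange): float
{
    return $learningRate * $delta * $input + $momentum * $previousChange;
}
```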
== Version 1.0 ==
=== Version '''1.0''' (2002) ===
'''Author: Eddy Young'''
* [https://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
7afb18afdef565b701ce1fbd29bed1400de7faa0
406
405
2020-11-27T10:41:44Z
Thwien
2
wikitext
text/x-wiki
<big>'''ANN - Artificial Neural Network for PHP'''</big>
This page offers the current stable version of the ANN implementation for PHP for download. See the [[Installation]] section for requirements and for information on how to integrate these PHP libraries into your project.
== Version 2.3 ==
=== Version '''2.3.0''' (2012-12-13) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.4 or above)</span> ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann230.zip Download - ann230.zip] (47 KB)
* [https://ann.thwien.de/downloads/ann230.tar.gz Download - ann230.tar.gz] (21 KB)
* [https://ann.thwien.de/downloads/ann230.phar.gz Download (Phar) - ann230.phar.gz] (22 KB)
'''MD5 fingerprints'''
* fd5d3ded9761dd745621ab06460e3e21 ann230.phar.gz
* dd1e604c7e16e666a98dc51f6dc38507 ann230.tar.gz
* 3ee3741ee830d4a5a095f122e5858ae6 ann230.zip
'''Documentation'''
* [https://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [https://ann.thwien.de/downloads/ann230.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Using traits for performance reasons
* Checking PHP version for compatibility with the ANN library
== Version 2.2 ==
=== Version '''2.2.3''' (2012-12-12) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.3 or above)</span> ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann223.zip Download - ann223.zip] (47 KB)
* [https://ann.thwien.de/downloads/ann223.tar.gz Download - ann223.tar.gz] (21 KB)
* [https://ann.thwien.de/downloads/ann223.phar.gz Download (Phar) - ann223.phar.gz] (22 KB)
'''MD5 fingerprints'''
* 6e4f100df2d3bbb99753b224c1348fdf ann223.phar.gz
* ae3ad1cc66f8803c7bf863702f90ba67 ann223.tar.gz
* 8e6f6de3b747f42450d27fff70aa2514 ann223.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann223.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: division by zero if training time is below one second
* Adding PHP version and SAPI interface to network information
* Tested on PHP 5.4
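The division-by-zero fix above arises when a throughput figure (e.g. patterns per second) is computed from a training time that rounds down to zero seconds. A guarded sketch; the function name and the one-second floor are assumptions for illustration:

```php
<?php
// Avoid division by zero when training finishes in under one second:
// clamp the elapsed time to at least one second before dividing.
function patternsPerSecond(int $patterns, int $elapsedSeconds): float
{
    return $patterns / max(1, $elapsedSeconds);
}
```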
=== Version '''2.2.2''' (2011-07-04) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann222.zip Download - ann222.zip] (46 KB)
* [https://ann.thwien.de/downloads/ann222.tar.gz Download - ann222.tar.gz] (21 KB)
* [https://ann.thwien.de/downloads/ann222.phar.gz Download (Phar) - ann222.phar.gz] (22 KB)
'''MD5 fingerprints'''
* 20b3123169666f0b245b172126e5da96 ann222.phar.gz
* 56b3bad2cf8850aa13f10a5c4ae38cd9 ann222.tar.gz
* 2a12e0b8baf89135db35ae5aee1d1fe7 ann222.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann222.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Redesign of network details
* Considering CPU limits when calculating network execution time
=== Version '''2.2.1''' (2011-06-15) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann221.zip Download - ann221.zip] (39 KB)
* [https://ann.thwien.de/downloads/ann221.tar.gz Download - ann221.tar.gz] (19 KB)
* [https://ann.thwien.de/downloads/ann221.phar.gz Download (Phar) - ann221.phar.gz] (21 KB)
'''MD5 fingerprints'''
* 7a366b40c05bb142f56412696dc86592 ann221.phar.gz
* f892a716b59ba77bc073d4e48a3c70e7 ann221.tar.gz
* 1311691e545ee549bfc4a67a8211ea4f ann221.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann221.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
* Using the interface \ANN\InterfaceLoadable to simplify the detection of loadable objects
=== Version '''2.2.0''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann220.zip Download - ann220.zip] (39 KB)
* [https://ann.thwien.de/downloads/ann220.tar.gz Download - ann220.tar.gz] (19 KB)
* [https://ann.thwien.de/downloads/ann220.phar.gz Download (Phar) - ann220.phar.gz] (21 KB)
'''MD5 fingerprints'''
* d25f4b817539cdcb404b93c61fe76f47 ann220.phar.gz
* 6599d8bbbdeae02b0ffbbf5d7cb3b426 ann220.tar.gz
* a79e15ecd9a81038982affd4f8a1cc51 ann220.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann220.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of namespaces as of PHP 5.3
* Dynamic learning rate
* \ANN\Network::setLearningRate() is protected now
* Bug fix: Wrong output type detection in some circumstances
== Version 2.1 ==
=== Version '''2.1.7''' (2011-06-15) '''''<span style="color: #2E8B57">stable</span>''''' <span style="color: #FF8C00">(PHP 5.2)</span> ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann217.zip Download - ann217.zip] (37 KB)
* [https://ann.thwien.de/downloads/ann217.tar.gz Download - ann217.tar.gz] (18 KB)
* [https://ann.thwien.de/downloads/ann217.phar.gz Download (Phar) - ann217.phar.gz] (20 KB)
'''MD5 fingerprints'''
* e57c2dd57f657e75c6b69dec816ebbb3 ann217.phar.gz
* 6dea816713cb3e8b8e9dccf581dc9cf7 ann217.tar.gz
* 15a6080e0376c7aae18d079f44d3f28c ann217.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann217.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in binary networks
=== Version '''2.1.6''' (2011-06-01) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann216.zip Download - ann216.zip] (37 KB)
* [https://ann.thwien.de/downloads/ann216.tar.gz Download - ann216.tar.gz] (18 KB)
* [https://ann.thwien.de/downloads/ann216.phar.gz Download (Phar) - ann216.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 4acbdb23bed762438a0c06c4f23f6a38 ann216.phar.gz
* 7cb3a1b1e17e1272dc02fa7fcbc7d964 ann216.tar.gz
* 9168c4f62693225d9a2785d7a72524de ann216.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann216.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Bug fix: Wrong output type detection in some circumstances
=== Version '''2.1.5''' (2011-05-24) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann215.zip Download - ann215.zip] (37 KB)
* [https://ann.thwien.de/downloads/ann215.tar.gz Download - ann215.tar.gz] (18 KB)
* [https://ann.thwien.de/downloads/ann215.phar.gz Download (Phar) - ann215.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 96ad7c92b08a44d89fd4a7a17b463a9e ann215.phar.gz
* 88ffad519058881f90eb24b4363d0d94 ann215.tar.gz
* 2d293c4f92d0d17f3e0e5ba8e79e3dbd ann215.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann215.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Splitting method ANN_Math::random() into ANN_Math::randomDelta() and ANN_Math::randomWeight()
* Better implementation of printing network details, including __invoke() and __toString() conversion
=== Version '''2.1.4''' (2011-05-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann214.zip Download - ann214.zip] (36 KB)
* [https://ann.thwien.de/downloads/ann214.tar.gz Download - ann214.tar.gz] (18 KB)
* [https://ann.thwien.de/downloads/ann214.phar.gz Download (Phar) - ann214.phar.gz] (20 KB)
'''MD5 fingerprints'''
* 998d377de058b959c5ad83141b168e5e ann214.phar.gz
* 12b7a028021477555613d533410526e0 ann214.tar.gz
* e6762c5f667e1710ac7efdca70cb7c41 ann214.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann214.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Better calculation of the remaining runtime of the network
* Fixing bug: incorrect generation of random delta
* Adding momentum value
* Simplified ANN_Neuron::adjustWeights()
* Removing a possibly wrong calculation in ANN_Neuron::adjustWeights() for linear networks
* Simplify ANN_Layer::calculateHiddenDeltas()
=== Version '''2.1.3''' (2010-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann213.zip Download - ann213.zip] (34 KB)
* [https://ann.thwien.de/downloads/ann213.tar.gz Download - ann213.tar.gz] (18 KB)
* [https://ann.thwien.de/downloads/ann213.phar.gz Download (Phar) - ann213.phar.gz] (20 KB)
'''MD5 fingerprints'''
* ae8c5eb97e1984c836df53cfffb06294 ann213.phar.gz
* 0a375525863eb3f9663b64655bb7b637 ann213.tar.gz
* 981709c7da085a17994cb71ee603f9e0 ann213.zip
'''Documentation'''
* [https://ann.thwien.de/downloads/ann213.phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of a date input support class
=== Version '''2.1.2''' (2009-12-26) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann212.zip Download - ann212.zip] (31 KB)
* [https://ann.thwien.de/downloads/ann212.tar.gz Download - ann212.tar.gz] (16 KB)
* [https://ann.thwien.de/downloads/ann212.phar.gz Download (Phar) - ann212.phar.gz] (18 KB)
'''MD5 fingerprints'''
* 0a0e350a56941a99bbf55537978e0bdb ann212.phar.gz
* 3cdea04e49898cdb2f0ab66864d4b4ad ann212.tar.gz
* 8e63c8b89293ee43337bfac0ee4fa67c ann212.zip
'''Documentation'''
* [https://ann.thwien.de/phpdoc/ Documentation (HTML online)]
* [https://ann.thwien.de/downloads/ann212_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of classification with ANN_Classification (for an example, see [[Detection of language with classification]])
* Phar support (as of PHP 5.3.0)
=== Version '''2.1.1''' (2009-12-23) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann211.zip Download - ann211.zip] (29 KB)
* [https://ann.thwien.de/downloads/ann211.tar.gz Download - ann211.tar.gz] (16 KB)
'''Documentation'''
* [https://ann.thwien.de/downloads/ann211_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Introduction of string association with ANN_StringValue (for an example, see [[Detection of language]])
=== Version '''2.1.0''' (2009-12-22) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann210.zip Download - ann210.zip] (27 KB)
* [https://ann.thwien.de/downloads/ann210.tar.gz Download - ann210.tar.gz] (15 KB)
'''Documentation'''
* [https://ann.thwien.de/downloads/ann210_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Default learning rate set to 0.7
* Code changes based on profiling
* Changed formatting of printed network details
* Removing trailing PHP end tag
* Improving code standard
* Removing momentum, precision and unused math methods
* Removing unused methods
* Removing error weight derivative
* Removing weight decay
* Removing dynamic learning rate
* Removing algorithm switches and experimental algorithms; only the standard backpropagation algorithm remains
* Renaming class file names
* Applying the learning rate to weight delta calculation
* Rounding of network error value
== Version 2.0 ==
=== Version '''2.0.7''' (2009-01-01) '''''<span style="color: #2E8B57">stable</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann207.zip Download - ann207.zip] (32 KB)
* [https://ann.thwien.de/downloads/ann207.tar.gz Download - ann207.tar.gz] (34 KB)
'''Documentation'''
* [https://ann.thwien.de/downloads/ann207_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Removing protected method ANN_Neuron::setOutput()
* Removing protected unused method ANN_Layer::getInputs()
* Removing protected unused property ANN_Layer::$arrInputs
* More detailed exceptions to ANN_Filesystem::saveToFile()
* Different distribution of activation calls across the layers
* Different adjustments in ANN_Neuron::adjustWeights() depending on output type
* Removing static local variables from ANN_Network::getNextIndexInputsToTrain()
* Increasing math precision
* Using class constants for output types (increasing performance)
* Fixing bug: ANN_Neuron::getOutput() is float and not array
=== Version '''2.0.6''' (2008-12-18) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann206.zip Download - ann206.zip] (30 KB)
* [https://ann.thwien.de/downloads/ann206.tar.gz Download - ann206.tar.gz] (33 KB)
'''Documentation'''
* [https://ann.thwien.de/downloads/ann206_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Printing of network details including the differences of outputs from their desired values
* Completely rewritten variable naming standard
* New class ANN_Values for defining input and output values
* Code examples to phpdoc
* Internal math precision defaults to 5
=== Version '''2.0.5''' (2008-12-16) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann205.zip Download - ann205.zip] (28 KB)
* [https://ann.thwien.de/downloads/ann205.tar.gz Download - ann205.tar.gz] (31 KB)
'''Documentation'''
* [https://ann.thwien.de/downloads/ann205_phpdoc.zip Documentation (HTML download, zip compressed)]
'''Change-Log'''
* Adjustable output error tolerance between 0 and 10 per cent
* Internal rounding of floats for performance issues
* Loading class for all ANN classes (SPL autoload)
* Renaming filename of ANN_Maths class
* Improving code standard
* Fixing bug: comparison in ANN_InputValue and ANN_OutputValue
=== Version '''2.0.4''' (2008-01-27) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann204.zip Download - ann204.zip] (25 KB)
* [https://ann.thwien.de/downloads/ann204.tar.gz Download - ann204.tar.gz] (29 KB)
'''Change-Log'''
* Weight decay
* QuickProp algorithm (experimental)
* RProp algorithm (experimental)
* Linear saturated activation function (experimental)
* Individual learning rate algorithm (experimental)
* Reducing overfitting (no training step if an input pattern already produces its desired output)
* Increasing performance on activation
* Increasing performance on testing all patterns to their desired outputs
* Increasing performance on calculating hidden deltas
* Increasing performance by defining layer relation by construction
* More details to printNetwork()
* Fixing bug: learning rate is not part of saved delta value
=== Version '''2.0.3''' (2008-01-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann203.zip Download - ann203.zip] (22 KB)
* [https://ann.thwien.de/downloads/ann203.tar.gz Download - ann203.tar.gz] (20 KB)
'''Change-Log'''
* Support for dynamic learning rate
* Automatic epoch determination
* Automatic output type detection
* Shuffling input patterns each epoch instead of randomized pattern access
* Bug fix: runtime error on call of setMomentum()
* Logging of network errors
* Logging on each epoch instead of each training step
* Avoiding distributed internal calls of setMomentum() and setLearningRate()
* Extending display of network details
=== Version '''2.0.2''' (2008-01-14) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann202.zip Download - ann202.zip] (21 KB)
* [https://ann.thwien.de/downloads/ann202.tar.gz Download - ann202.tar.gz] (17 KB)
'''Change-Log'''
* Client-Server model for distributed applications
* Calculating total network error for csv logging
=== Version '''2.0.1''' (2008-01-06) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann201.zip Download - ann201.zip] (19 KB)
* [https://ann.thwien.de/downloads/ann201.tar.gz Download - ann201.tar.gz] (16 KB)
'''Change-Log'''
* Separation of classes to several files
* Version control by Subversion
* Performance improvements
* Graphical output of neural network topology
* Logging of weights to csv file
=== Version '''2.0.0''' (2007-12-17) '''''<span style="color: #CD5C5C">obsolete</span>''''' ===
'''Author: Thomas Wien'''
* [https://ann.thwien.de/downloads/ann200.zip Download - ann200.zip] (6 KB)
* [https://ann.thwien.de/downloads/ann200.tar.gz Download - ann200.tar.gz] (6 KB)
'''Change-Log'''
* PHP 5.x support
* PHPDoc documentation
* Momentum support
* Avoiding network overfitting
* Linear / binary output
* ANN_InputValue + ANN_OutputValue classes
* Exceptions
* Threshold function
* Tangens hyperbolicus transfer function
* Several performance improvements
* Avoiding array_keys() & srand() due to performance
* Changes in saving and loading network
* Printing network details to browser
* Fixing bug: initializing inputs to all hidden layers
* Fixing bug: training for first hidden layer was skipped
== Version 1.0 ==
=== Version '''1.0''' (2002) ===
'''Author: Eddy Young'''
* [https://ann.thwien.de/downloads/ann100.zip Download - ann100.zip] (6 KB)
'''Change-Log'''
* Initial version
FAQ
2012-12-12T21:22:48Z
Thwien
== What are the dat-files? ==
A dat-file is an auto-generated file. Its content is a serialized structure of the saved object. It is generated while training the neural network by using '''\ANN\InputValue::saveToFile()''' or '''\ANN\Network::saveToFile()''' and can be reloaded into the running network by '''\ANN\InputValue::loadFromFile()''' or '''\ANN\Network::loadFromFile()'''.
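Since a dat-file is simply a serialized object written to disk, the mechanism can be illustrated with a small self-contained sketch. '''ExampleConfig''' is a made-up stand-in class for this illustration, not part of the ANN package:

```php
<?php
// Stand-in class for illustration only; not part of the ANN package.
class ExampleConfig
{
    public $floatLearningRate = 0.7;
}

$objConfig = new ExampleConfig;

// Saving: serialize the object and write it to a dat-file.
file_put_contents('example.dat', serialize($objConfig));

// Loading: read the dat-file and restore the object.
// The class must be defined (or autoloadable) before unserializing.
$objRestored = unserialize(file_get_contents('example.dat'));

print $objRestored->floatLearningRate; // prints 0.7
```

This is the round-trip that saveToFile() and loadFromFile() perform for the network and value objects.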
The dat-files are not included in the example code downloads because they are auto-generated.
To save the dat-files, the directory you save them in must be writable by the running PHP process.
For example, if your PHP script runs as an Apache PHP module and Apache runs as user ''www-data'', you can set the permissions as follows.
Change to the directory where your own ANN script is stored.
>cd <PROJECTDIR-OF-YOUR-ANN>
Create a subdirectory for all your dat-files.
>mkdir dats
Change the group owner of this subdirectory to ''www-data''.
>chgrp www-data dats
Change the UNIX permissions to the group for write access to this subdirectory.
>chmod g+w dats
Due to security remove all ''others'' permissions in this case.
>chmod o-rwx dats
List your permissions.
>ls -la dats
drwxrwx--- 7 user www-data 4.0K 2009-10-28 08:52 dats
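The shell steps above can also be performed from PHP itself, e.g. in a one-time setup script. This is a minimal sketch; ''www-data'' is an assumption and should be replaced by the group your web server actually runs under:

```php
<?php
// One-time setup sketch: create the dats directory with group write access.
// 'www-data' is an assumed web-server group; adjust it for your system.
$strGroup = 'www-data';

if (!is_dir('dats'))
    mkdir('dats');               // subdirectory for all dat-files

@chgrp('dats', $strGroup);       // may fail if the group does not exist
chmod('dats', 0770);             // owner and group full access, others none

print is_writable('dats') ? 'dats is writable' : 'dats is NOT writable';
```

Note that chgrp() only succeeds if the PHP process owns the directory or runs as root; otherwise keep using the shell commands above.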
Use this subdirectory in your PHP scripts:
<source lang="php">
require_once 'ANN/Loader.php';

use ANN\Network;
use ANN\Values;

try
{
    $objNetwork = Network::loadFromFile('dats/xor.dat');
}
catch(Exception $e)
{
    print 'Creating a new one...';

    $objNetwork = new Network;

    $objValues = new Values;

    $objValues->train()
              ->input(0, 0)->output(0)
              ->input(0, 1)->output(1)
              ->input(1, 0)->output(1)
              ->input(1, 1)->output(0);

    $objValues->saveToFile('dats/values_xor.dat');

    unset($objValues);
}

try
{
    $objValues = Values::loadFromFile('dats/values_xor.dat');
}
catch(Exception $e)
{
    die('Loading of values failed');
}

$objNetwork->setValues($objValues); // to be called as of version 2.0.6

$boolTrained = $objNetwork->train();

print ($boolTrained)
    ? 'Network trained'
    : 'Network not trained completely. Please re-run the script';

$objNetwork->saveToFile('dats/xor.dat');

$objNetwork->printNetwork();
</source>
== How to adjust the network's error tolerance? ==
The default error tolerance of the neural network is 0.02 when running a linear network. To increase the tolerance, use the following code.
<source lang="php">
$objNetwork->setOutputErrorTolerance(0.1);
</source>
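Conceptually, an output counts as correct when it lies within the tolerance of its desired value. The following self-contained sketch illustrates that comparison; it is an illustration only, not the library's internal code:

```php
<?php
// Illustration of output error tolerance; not the ANN package internals.
function withinTolerance($floatOutput, $floatDesired, $floatTolerance)
{
    // An output is acceptable if its absolute error stays inside the tolerance.
    return abs($floatOutput - $floatDesired) <= $floatTolerance;
}

var_dump(withinTolerance(0.985, 1.0, 0.02)); // bool(true)  - inside default 0.02
var_dump(withinTolerance(0.95,  1.0, 0.02)); // bool(false) - error of 0.05
var_dump(withinTolerance(0.95,  1.0, 0.1));  // bool(true)  - after raising tolerance
```

Raising the tolerance therefore lets training finish earlier at the cost of less precise outputs.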
Copyright
2020-11-27T14:06:08Z
Thwien
The copyright conditions are included in the source files.
<source>
* Artificial Neural Network for PHP
*
* For updates and changes visit the project page at https://ann.thwien.de/
*
*
*
* <b>LICENCE</b>
*
* The BSD 2-Clause License
*
* http://opensource.org/licenses/bsd-license.php
*
* Copyright (c) 2002, Eddy Young
* Copyright (c) 2007 - 2020, Thomas Wien
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
*
* 1. Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
*
* 2. Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in the
* documentation and/or other materials provided with the distribution.
*
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
* LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
* FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
* COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
* INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
* BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
* LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
* LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN
* ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
* POSSIBILITY OF SUCH DAMAGE.
*
* @author Eddy Young <jeyoung_at_priscimon_dot_com>
* @author Thomas Wien <info_at_thwien_dot_de>
* @version ANN Version 1.0 by Eddy Young
* @version ANN Version >=2.x by Thomas Wien
* @copyright Copyright (c) 2002 by Eddy Young
* @copyright Copyright (c) 2007-2020 by Thomas Wien
* @package ANN
</source>
Development
2020-11-27T10:39:29Z
Thwien
<big>'''ANN - Artificial Neural Network for PHP'''</big>
This page invites you to participate in the development of the ANN implementation for PHP. Please feel free to clone the Git repository, enhance the code, and send your repository or patch files to info_at_thwien_dot_de.
== Public ANN Git Repository ==
# git clone https://ann.thwien.de/ann.git
== Rules for developers ==
* Write your own code; don't copy it from others
* Respect the licence
* Follow the existing code standard
* Test your code changes before sending patches
* Add examples to the example directory
* Use PHPDoc comments